@Doncorleone: It doesn't bother me; I'm just trying not to drag things out so that everyone gets their gift (I deleted the post on your wall and sent you a PM).
I think mine will be available today (they will be unlocked for those who have finished).
I just created an account on Mega; everything will be listed there. Only the workers will have the password.
Seems like an interesting and clever plan on your part, but what is the use of this, since GSA can only post to, let's say, 5% of the websites you scrape? You will get tons of CMSs with no commenting enabled at all, tons of static HTML sites, tons of sites with moderated posting, tons of platforms and dashboards that are of no use for backlinks at all, etc. I mean, if you are willing to push a few hundred million links through GSA you may end up with maybe 1 million unique URLs that GSA can post to, but it will take a lot of work and resources to get it done.
You are right; however, SER does not linger on a site it does not recognize or cannot post to.
It simply reports "No engine found" and moves on.
It will also help us scrape in a different way, and probably surface sites that we cannot currently find.
Everyone wants to scrape from sources like Ahrefs, Moz or other databases; we bypass that problem and will do something that, to my knowledge, nobody else has done. On top of that, we combine our resources.
If you have one available, send me one of the lists and I will scrape it with GScraper. I would also like to see the results in GSA with the scraped list.
This is only available to those who have finished their job. Please do not rush the work. I expect everyone to take this seriously, since we each benefit from the work of the others. With that in mind, be careful when selecting the language for your list.
The original list is 2.24 GB × 2 and covers .com domains.
Let's call them lists 1 and 2.
List 1 is therefore the 2.24 GB file, split into chunks of one million lines.
That gives us 13 files of roughly 190 MB each!
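If anyone wants to reproduce the split on their own list, here is a minimal sketch; the input name big-list.txt is only a placeholder, and the output names follow the GScraper-Split-N.txt pattern used later in this thread:

```python
# Minimal sketch: split a huge URL list into chunks of one million lines each.
# "big-list.txt" is a placeholder for the 2.24 GB source file; output files
# follow the "GScraper-Split-N.txt" naming pattern mentioned in this thread.
CHUNK_LINES = 1_000_000

def flush(lines, name):
    with open(name, "w", encoding="utf-8") as out:
        out.writelines(lines)

def split_list(src="big-list.txt", prefix="GScraper-Split"):
    part, lines = 1, []
    with open(src, "r", encoding="utf-8", errors="ignore") as f:
        for line in f:
            lines.append(line)
            if len(lines) >= CHUNK_LINES:
                flush(lines, f"{prefix}-{part}.txt")
                part, lines = part + 1, []
    if lines:  # write the final, partial chunk
        flush(lines, f"{prefix}-{part}.txt")

if __name__ == "__main__":
    split_list()
```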
Why is the file so big?
Because the queries sent to Google are built to return as many results as possible, trimmed down to the root domain (TLD). That way we get a better yield and fewer duplicates.
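Just to illustrate the idea (this is not the exact GScraper/GSA workflow), reducing every scraped URL to its root domain before deduplicating looks roughly like this:

```python
# Rough illustration: reduce scraped URLs to their root domain and deduplicate,
# so each site is kept once no matter how many deep pages Google returned.
# (Simplified: does not handle multi-part suffixes like .co.uk.)
from urllib.parse import urlparse

def unique_domains(urls):
    seen = set()
    for url in urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]
        if host and host not in seen:
            seen.add(host)
            yield host

scraped = [
    "http://www.example.com/blog/post-1?utm=x",
    "https://example.com/contact",
    "http://another-site.fr/forum/thread/42",
]
print(list(unique_domains(scraped)))  # ['example.com', 'another-site.fr']
```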
At this point I have already finished the .FR domain list (to be unlocked for those who finish their job).
I am now starting on the first file, "GScraper-Split-1.txt".
I sent a PM to all workers with my Skype address for this project.
I'm looking for the best way to centralize our work, so that only members who have done their part can access the download of all the lists.
I spoke with Justin, and the best way seems to be access via IP.
vort3x will probably set up a chat so we can all share our points of view at the same time.
Please let me know the details.
The best is yet to come.