
GScraper URL List Stops

edited July 2014 in Other / Mixed
Hi Everyone

I have GScraper on a VPS and on a local machine to test scraping lists for GSA, but it's stopping when it shouldn't.

I get a scrape going, say, for example, a URL list of 40,000, and then it stops, but the status bar at the bottom still says the scrape is running.

Even if I use a few or hundreds of footprints, the URL list gets to a certain number and then doesn't go any further.

Does this mean it's not finding any more targets, or is the software not working?

I thought people were having GScraper run for hours, days, etc., but my scrapes run for only a short time before they stop.

Many thanks to anyone who can shed some light.



  • Trevor_Bandura
    If it's not scraping any new URLs, I would think that the proxies you're using for scraping are blocked on Google.
  • Okay, thanks Trevor, I'll investigate that side of things more.



  • Proxies seem fine. I loaded random ones into IE, entered the username and password, and they work. Tested on Scrapebox too.

    Will keep trying to think of anything else, but if anyone has any other ideas I'd be very grateful.

  • Post screenshots of your settings here. How many threads do you use, and how many proxies? Do you use private proxies for harvesting?
  • Trevor_Bandura
    Test the proxies in GScraper and see how many are left after testing.
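For anyone who wants to replicate that test outside GScraper, here is a minimal sketch of the idea. The test URL and the proxy format are assumptions, not GScraper's actual internals: a proxy counts as dead if a Google query through it times out, errors, or returns anything other than HTTP 200.

```python
import urllib.request

def check_proxy(proxy, timeout=10):
    """Return True if a Google query succeeds through this proxy.

    `proxy` is "host:port" or "user:pass@host:port".  A timeout, a
    connection error, or a 403/429 ban page all count as dead, which
    is roughly what a harvester's proxy test is checking for.
    """
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": "http://" + proxy,
                                     "https": "http://" + proxy}))
    try:
        with opener.open("https://www.google.com/search?q=test",
                         timeout=timeout) as resp:
            return resp.status == 200
    except Exception:
        return False
```

Running `check_proxy` over your list and counting the survivors gives you the same "how many are left" number the GScraper test shows.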

  • Ah.. all but 3-4 of the proxies get deleted (thanks Trevor), and thanks to satyr85 too, of course.

    I'm running 20 dedicated proxies from Buyproxies and have never had an issue with Scrapebox or GSA.

    How many threads would you recommend for 20 dedicated proxies? I'm very surprised they burnt out so quickly.

    Many thanks.

  • satyr85 - I just found a comment you made on another forum about a good public proxy service.

    What do you recommend?

    Many thanks.

  • It's not possible to harvest in GScraper with just 20 private proxies. You would need high XXX or low XXXX private proxies. GScraper is a very fast harvester and it gets proxies banned very quickly. The proxy service I was talking about was my own proxy service. You can read more about the alpha test of this service here.
  • Excellent, thanks satyr85. As I was used to Scrapebox, I didn't realise how fast GScraper was and how quickly it could burn proxies out. Now I know.. especially after reading that a dedi would be best.

    I will check out your link now :) Glad I have learnt from you both; it was so obvious but it was driving me nuts :)

  • @robstone74 Did you ever think of scraping public proxies with GSA, then saving them to a file and importing them into GScraper?

    I wonder if that would be a viable solution instead of scraping with private proxies. I read you're not supposed to do that because searching with footprints will cause Google to ban/block your IP.
  • silverdot 
    I have a nice bot - it's able to test 10-15k harvested proxies per minute against Google, plus it's able to harvest many more. With this custom software (I was paying $50 per month for the server alone) I didn't get enough proxies to scrape at a satisfying speed. SER has ~5-10% of the power of this software (when it comes to proxy harvesting and testing), so... no, it's not an option.