
How good is GSA without verified lists for powering up 20+ Web2.0's?

I used it a few months ago for a few days, but I didn't manage to get more than 10-20 verified links in 48+ hours.

Not sure what I was doing wrong, but the verified ratio was really bad, and so was the LPM.


  • edited May 2018
    You can use PSR engines; there are over 800 contextual engines.
  • Not looking to spend extra money on engines / lists
  • The only real weakness of the SER system is not in GSA itself; it is Google's ability to stop scrapers.

    There are a few options to make SER an absolute beast; if you don't want to spend any money on SER, don't.

    No matter what you are doing, you will need a scraping proxy solution.

    Personally, I do not scrape with SER. I have Hrefer scraping on a separate server, and Hrefer saves its output every 24 hours.

    Platform Identifier then runs once every 24 hours: it dedupes, identifies, and also runs with the blacklist enabled.
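
    The dedupe step in that pipeline can be sketched in a few lines. This is a hypothetical illustration, not part of Platform Identifier; the case-insensitive scheme/host keying is an assumption about how you might want duplicates collapsed.

    ```python
    # Hypothetical sketch: dedupe a scraped URL list before identification.
    # Keys on lowercased scheme + host plus the path, preserving input order.
    from urllib.parse import urlsplit

    def dedupe_urls(lines):
        """Return unique URLs from an iterable of raw scraped lines."""
        seen = set()
        unique = []
        for line in lines:
            url = line.strip()
            if not url:
                continue
            parts = urlsplit(url)
            key = (parts.scheme.lower(), parts.netloc.lower(), parts.path)
            if key not in seen:
                seen.add(key)
                unique.append(url)
        return unique

    scraped = [
        "http://example.com/blog/post1",
        "HTTP://EXAMPLE.COM/blog/post1",   # duplicate, host case differs
        "http://example.com/blog/post2",
    ]
    print(dedupe_urls(scraped))  # two URLs survive
    ```

    Running a pass like this before identification keeps the daily batch small, which matters when Hrefer is dumping output every 24 hours.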

    GSA SER has a powerful built-in public proxy scraper, but I prefer the standalone GSA Proxy Scraper, which again runs on a separate server and outputs to a text file on FTP that all GSA products then use as their public proxy source. That too updates every 24 hours.
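
    The shared-proxy-file idea above is just "every instance pulls the same text file over FTP". A minimal sketch, assuming a plain one-proxy-per-line file; the host, credentials, and path are placeholders, not details from this thread:

    ```python
    # Hypothetical sketch: fetch a shared public-proxy list over FTP.
    # parse_proxies() handles the text format; fetch_proxy_list() does the I/O.
    from ftplib import FTP

    def parse_proxies(text):
        """Return non-empty ip:port entries, one per line, whitespace stripped."""
        return [line.strip() for line in text.splitlines() if line.strip()]

    def fetch_proxy_list(host, user, password, remote_path):
        """Download the proxy text file and return its entries as a list."""
        lines = []
        with FTP(host) as ftp:
            ftp.login(user=user, passwd=password)
            ftp.retrlines(f"RETR {remote_path}", lines.append)
        return parse_proxies("\n".join(lines))

    # Usage (placeholder values):
    # proxies = fetch_proxy_list("ftp.example.com", "user", "pass", "proxies.txt")
    ```

    Splitting the parsing out from the FTP call keeps the format logic testable without a live server.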

    Yes, you could do all of these things on local machines, or on a cheap VPS; however, none of these options are free.

    I think the question should be not how good the tool is, but how good you are at working out a creative solution to get it running.

    I can get a single instance of SER running at 400 LPM with no email solution, no proxy solution, and no paid list.

  • shaunshaun
    Depending on the engines you had selected, 10-20 verified in 48 hours could actually be a good ratio.