Why is GSA so slow?

edited December 2012 in Need Help

I am new to GSA. I bought it recently and have run it for over a week now on 2 projects with a minimum PR1 filter, and it has only built 800 links.

I have proxy harvesting enabled for everything.

I just want to ask why it's so slow. Is that normal? I am running it off a VPS with a 1 Gbps port. I also don't have "dofollow only" links checked, just to make it faster, but it is still slow.

Can someone help or advise? Also, would it hurt my site to have a lot of nofollow links?



  • edited December 2012 Accepted Answer
    I'm not exactly a pro with it yet, but I would look at the basics if I were in your shoes. I take it that your hardware and your connection are both fine. Public proxies will always slow down your ability to work - private proxies are working well for me.

    How many keywords are you using? It appears the more the better - I'm using quite a few with good results.

    How many search engines are you using? Just Google? I'm using virtually all of them but plan on reducing that to mostly English-based regions. Using all of the SEs has been quick - not xrumer fast, but much better than just using Google.

    How many platforms are you using? Are you using an appropriate number of threads? What are your expectations?

  • s4nt0ss4nt0s Houston, Texas
    edited December 2012 Accepted Answer
    Like GODOVERYOU said, public proxies will always slow you down.

    I use private proxies and can build over 800 links within a few hours. Then again, it also depends on your settings. We would need more information.

    Are you skipping captchas? Using Captcha Sniper only? OBL? Bad word filters? What other project options are selected? Thread count? Etc.

    You didn't give us very much to work with.


    Thanks for the reply

    I am not using private proxies; I am just letting GSA harvest proxies and then using them for everything.

    I am using it on a VPS with good resources and a super fast connection, and I am not using even 50% of the resources.

    I am using all English search engines as well as over 50 keywords, but it's still super slow here.

    Any suggestion?
  • I am using both Captcha Sniper and DeathByCaptcha as a second option, and bad word filters just to prevent link building on spam and porn websites. The other option selected is "not less than PR1" for creating backlinks, and everything else is pretty much the global settings.

    I am using over 150 threads

    Anything I should be checking?
  • s4nt0ss4nt0s Houston, Texas
    You can try lowering the time to wait between search engine queries. That option is right under where you configure your proxies.

    I've never tried it with public proxies so I'm not sure how that will work.

    Public proxies are going to be your problem though.
  • Probably proxies and/or your system are the bottleneck.

    I've been playing around with it and it's a breeze at 300, 500 and even 1000 threads, though I've not tested it for long - only around an hour or two - but it ran fast without a problem.

    In that time it dropped around 3K links and verified 800 of them.

    PS - I don't recommend setting the thread count that high. I was only testing how fast GSA would go.

    I have been using GSA for 4 or 5 months now and I have noticed my submission rates have really dropped. I moved over to a VPS and it was the same, so this week I completely rebuilt my computer and installed a fresh copy of GSA on it.

    I then set up one project only. I have 10 private proxies, 50 threads, and skip sites with a PR below 2. I am not using site lists, just letting it scrape on its own. I have a bad word list, and have pretty much everything selected apart from pingback, indexer and referrer. I am only using CS.

    The project has been running for 2 days now and I have 950 submissions and 34 verified. A few months ago, when running identical projects, I would be getting thousands of submissions.

    Also, even though I have 50 threads, the thread count stays at 1 for quite a while. I have unchecked "monitor my PC" as well.

    My PC is an i7 3.4 GHz with 8 GB of RAM.

    As I have been writing this, my thread count has rarely got above 2. I also had a look in Task Manager and there don't seem to be any problems with memory. Not sure what has changed?

  • ronron SERLists.com

    @micb11 - This is just my humble opinion, but a lot of people seem to get hung up on setting a minimum PR level. This greatly reduces the number of targets, and the number of links built.

    Again, I have been doing this a very long time and I do understand the rationale for high PR properties. All I am saying is that I do not use that setting, and I am crushing it on ranking some pretty competitive terms.

  • s4nt0ss4nt0s Houston, Texas
    edited December 2012
    @ron - That's interesting. I think I might try to test a few campaigns like that. What about OBL/Bad Words? I'd probably still make use of these filters. Are you running without them?
  • ronron SERLists.com

    @s4nt0s - I use my own bad word list which I probably got from you, hah :) But yes, I always use the bad words filter.

    I quit using the OBL filter also because *most* of the links had few OBL's anyway.

    I just think these two filters (PR and OBL) can strangle linkbuilding. Again, they were designed with SEO best practices in mind, but Google has gotten easier to rank since April 25th IMHO.

  • LeeGLeeG Eating your first bourne

    I'm surprised people are saying they get poor submissions.

    This is a screenshot of one of my better days recently. Easily hit with the right settings.

    People I have coached get 40k submissions a day



    That was until versions 4.76 and 4.77 were released

  • edited December 2012
    I've been using it for 7 months now. I do all my own scraping and feed it into SER. It's much more efficient for me. Private proxies are a must. Screen shot below is from today

  • ronron SERLists.com

    @LeeG - That's pretty impressive. I am using 30 private proxies but only 100 threads, and getting about 25,000 submitted - without feeding SER any lists. So it's proportional, but you are definitely pushing more threads on those 38 proxies.

    Are you feeding it lists, or is SER doing this volume finding its own targets?

  • LeeGLeeG Eating your first bourne

    There are tweaks that can be done. There has been a lot of testing along the way.

    Working out what takes time and can be dropped, killing off the myths that keep getting repeated as gospel.

    The only captcha service used is captcha sniper to get those results


    Basics behind how those numbers are hit

    Using an external captcha service adds time to posting. With Captcha Sniper, you're below two seconds per captcha. External services take time; over fifteen seconds has been seen when a service is busy.

    Add extra captcha types and you're hitting more sites. Some can take seconds to add, others can take a bit of messing around.

    Captcha Sniper retries: set to zero. Don't bother with multiple retries. I jumped from a 40k average to 60k just with that one setting.
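    The point above about solve time and retries can be sketched with a rough throughput model. All the constants below (posting time, thread count, solve times) are hypothetical, purely to illustrate why retries and a slow external solver eat into daily submissions:

```python
# Back-of-envelope model, not taken from GSA itself: every captcha retry
# adds another solve attempt to each submission's total time.
def submissions_per_day(solve_seconds, retries, post_seconds=10, threads=150):
    # Average wall time for one submission on one thread.
    per_submission = post_seconds + (1 + retries) * solve_seconds
    return int(threads * 86_400 / per_submission)

local = submissions_per_day(solve_seconds=2, retries=0)      # fast local solver, zero retries
retried = submissions_per_day(solve_seconds=2, retries=3)    # same solver, 3 retries
external = submissions_per_day(solve_seconds=15, retries=0)  # busy external service

print(local, retried, external)
```

    Even with made-up constants, the ordering holds: a fast local solver with zero retries beats the same solver with retries, which in turn beats a slow external service.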

    Time between search engine queries: add enough engines, but not all Google clones, and GSA lets you search without proxies and reduce the time between searches. This takes out one place where proxies are used. A direct connection to a search engine is quicker than going through a proxy, and I have never been blocked by an engine.

    I add lists, but only things like wikis, pliggs and article directories. I never feed it blog lists. Any blog lists shared or sold on the net, IMO, soon become spam havens.


    The only doubt I have now is the recent change to the scheduler and the random posting. Prior to that change, you could evenly spread your daily links. Now that's been taken away.

  • LeeGLeeG Eating your first bourne

    Another tweak I forgot about is the html timeout

    Mine is set to 130; it will go lower when I tweak it again.

    Set that to max, then work down while it's running the link checking.

    You will find a sweet spot where you get a lot of verified and few page timeouts.

    Set it too high and a lot of time is wasted waiting to be told a link doesn't exist or a site is down.

    Speed of proxies can affect that time setting
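    The sweet-spot tuning described above can be mimicked with a toy sweep. The response-time distribution here is invented (exponential, 20-second mean) just to show the tradeoff: a higher HTML timeout reaches more slow sites but burns more seconds waiting on dead ones:

```python
import random

random.seed(1)
# Hypothetical site response times in seconds; real targets will differ.
response_times = [random.expovariate(1 / 20) for _ in range(10_000)]

def sweep(timeout):
    # Sites that respond within the timeout vs. seconds wasted on the rest.
    reachable = sum(1 for t in response_times if t <= timeout)
    wasted = timeout * sum(1 for t in response_times if t > timeout)
    return reachable, wasted

for timeout in (60, 130, 180):
    reachable, wasted = sweep(timeout)
    print(f"timeout={timeout:>3}s  reachable={reachable}  seconds spent waiting on timeouts={wasted}")
```

    With this made-up distribution, raising the timeout past the bulk of the responses gains very few extra sites, which is the "work down from max" logic in a nutshell.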


    Spec on my VPS: Windows Server 2008 R2, 6 GB of RAM, 60 GB hard drive, 100 Mbit connection.

    With GSA, CSX, antivirus and my operating system, I'm just over 20 GB used on the hard drive.

    GSA doesn't go much above 10 Mbit under normal running. It normally averages about 6 Mbit.

    If you shop around, you can get some good deals on a vps.

  • ronron SERLists.com
    edited December 2012

    @LeeG - I quit using DBC as a backup months and months ago because it was racking up too much cost.

    That is excellent advice about setting the captcha retries to zero. After all, we are not looking for perfection in a task - we are looking to crank out the highest volume given our parameters.

    Challenging the status quo - that is exactly what I was saying above. I don't get caught up in things like PR and OBL's because they really put a drag on the linkbuilding.

    Here's my settings:

    • 120-second HTML timeout
    • 3 MB limit on website size (which, by the way, I think a lot of people need to look at because this can really dog your efficiency)
    • 10 seconds between search engine queries
    • 30 semi-dedicated private proxies (buyproxies)
    • Use a bad word list
    • No OBL filters
    • No PR filters
    • Home PC - AMD quad-core, 64-bit, 3.2 GHz, 8 GB RAM, two 1 TB hard drives, Kaspersky AV, 36 Mbps.

    I think we are roughly on the same page, but that is great advice on getting rid of the CSX retries - in fact, I just set it to zero. I'm also going to bump up my threads to 150 to see how SER handles it.

    Let me know what your wait time is between search engine queries... I'm interested to see where you are at with that.

    Also let me know how many search engines you use. I have mine set to all 156 English search engines. I have to admit, I never experimented with that setting.

    I'm probably going to a dedi with powerup. I'm at 30 projects right now, and I know that I'm going to dog my home PC if I keep adding projects.

    It's nice to know what others are doing, and I'm sure this discussion will help others.

    Thanks dude!



  • Thanks a lot for all the help here; I have been reading some cool stuff.

    I am new to GSA, but I have to say that for Tier 2 and Tier 3, this is quick.

    My only problem is generating high quality, relevant PR2-and-above links on autopilot, and that's where I am facing the slowness.

    It's been 2 days now and not even 1k. However, this afternoon I started Tier 2 and Tier 3 for another campaign and it's over 5k now just for Tier 2, 8k for Tier 3.

    I have been reading that some are saying you can import extra sites into GSA? How is that possible? Sorry, I am new to GSA still.
  • @LeeG - Thanks for the tip on CS. Going to try that... I had it set at 6 retries followed by DBC (based on the recommendations for CS I read somewhere).

    @ron - You are right about OBL and PR throttling the number of links. I've had only a few links in 24 hours since I went with PR3 (subdomain) on a new site. Otherwise I have had over 2K links verified overnight at a moderate thread count of 20-30.

    Personally, I don't use proxies for scraping (I select the 37 search engines shared by Ozz in another thread). I do use proxies to post (20 dedicated from Buyproxies.org) and I'm still experimenting with the thread count. I've tried 100, 300, 500 and even 1000, but that's mostly for tiers 1 to 3. For the money site I prefer to take things slow and use only 20-30 threads to spread out the link building and limit it to around 30-50 links per day (using PR and OBL filters - and truth be told, GSA does struggle to get those 30-50 links when using only a couple of platforms). I guess I'm going to take a page out of ron's book and disable these filters and see how it goes.

    @zalouma - Use ScrapeBox to scrape a ton of links (I started with around 300,000), then right-click on the project and import target URLs from file/clipboard. If you are asking about adding new sites and platforms... sorry, I cannot help. I'm not a geek when it comes to those things (my ZennoPoster is still lying idle).
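    One way to make an import like that go further (a sketch of my own, not a built-in SER feature): de-duplicate the ScrapeBox harvest by domain before feeding it in, since the repeats will mostly just produce "already parsed" messages anyway:

```python
from urllib.parse import urlparse

def dedupe_by_domain(urls):
    """Keep only the first URL seen for each domain."""
    seen, kept = set(), []
    for url in urls:
        domain = urlparse(url).netloc.lower()
        if domain and domain not in seen:
            seen.add(domain)
            kept.append(url)
    return kept

# Hypothetical harvest for illustration.
harvest = [
    "http://example.com/blog/post-1",
    "http://example.com/blog/post-2",          # same domain, dropped
    "https://another-site.org/wiki/Main_Page",
]
print(dedupe_by_domain(harvest))
```

    Save the result to a text file, one URL per line, and that's the file you import as target URLs.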
  • ronron SERLists.com
    edited December 2012

    @tumpien - When you mentioned lowering the threads for the moneysite, that really is not necessary. You put a smaller limit on the links for the homepage, and in essence, that is your throttle. The thread count is global across all projects, so you don't need to adjust that on a per project basis. I also use private proxies for everything in SER. Essentially, it just runs. I don't stop it other than to update, and I do make an effort to check all emails for blacklisting no longer than once every 2 days.

    Ozz did a fantastic writeup that you should bookmark and probably print out for your future strategy:


    Ozz should have charged everyone money for it. It will seriously help your rankings. I have to admit I was already doing something very similar, and I can attest that it works. It really should be stickied, and I wish @Sven would do that.

    Please do me a favor and paste the thread where Ozz talks about the 37 search engines. I am interested to see what was discussed there.

    @indylinks - I forgot to say to you that those stats of yours are fantastic. It's exciting to see that many verified (as a % of submitted) in a run, and it's proof of how valuable importing lists is.


  • AlexRAlexR Cape Town
    @LeeG - How many threads are actually running normally? What do you have the website download size set to?
  • @ron - Thread by Ozz about which search engines to use. It also has a mediafire link that you can use to select search engines for projects.

  • @ron - Wow, lots of comments, great stuff. I will try your settings to see how that goes and report back on this thread.

    Back to my original comment: I was using identical settings a few months ago and was getting 3K submissions a day, but now not so many.

    Going to get me some more proxies and see how it goes

  • LeeGLeeG Eating your first bourne

    I run 13 search engines

    Even that I consider too high at times.

    A lot of the engines are Google spin-offs.

    It doesn't matter what you call a chicken; it's still a chicken.

    If you ever track your results, Google UK, Google US etc. return most of their results within one or two places of each other. Why pull the same results time and time again and waste time that could be better spent posting links?

  • Thanks for some great tips everyone.
  • ronron SERLists.com
    edited December 2012

    @LeeG - But what is your timeout setting between search engine queries? I think that is important.

    @tumpien - Thanks for the share on the 37 search engines. I do not know how I missed that...

    @Ozz - Great work once again!

  • LeeGLeeG Eating your first bourne

    10 second timeout between queries.

    I'm only running 13 search engines at present.


    Most are just the same chicken with a fancy name

    All you end up with are the same results, so why waste time trying to post to the same old sites and getting the "URL already parsed" message?

    I run a VPS to build links, not to keep adding traffic to blogs.

    Bandwidth costs money; why waste it?

  • LeeGLeeG Eating your first bourne

    Here ya go, a screenshot to save anyone getting confused.


  • ronron SERLists.com
    @LeeG - Trust me, the light bulb went on. You are so correct on the duplication effort of SER, and the resulting drag on efficiency. It would be like searching both Google and AOL, where AOL is already powered by Google. Thanks for the confirmation on the wait time between SE queries - I'm the same.
  • LeeGLeeG Eating your first bourne
    It's not a tried and tested number, just one I liked the look of :)
  • Nice community sharing makes SER a much more powerful tool.