
Some Of My Observations From Testing GSA SER

edited September 2012 in Other / Mixed
This is kind of long...I apologize, but maybe it will help answer some people's questions.

This is not a question for anyone to answer. It is just some of my "observations" while running some tests. I am a new user of GSA SER. Being such, before I really started to "load it up", I wanted to get a better feel for what it could do and what I could expect as far as results.

That being said, I hope that everyone using this program understands how complex all the processes are that are being done and how many variables are involved. Just to be sure, here are some of them...

- # of projects
- # of types of sites selected
- # of URLs you are processing
- # of search engines you selected
- # of keywords you entered
- "popularity" (or lack of) for the keywords you entered
- PR of sites you selected
- Processor power of PC you are running on
- RAM of PC you are running on
- Speed of your internet connection

...and there are many, many more. The point being, for someone to post a question like "Why am I not getting more links posted?" is almost an unanswerable question. I admire the support that I have seen given to such vague questions! (Thanks guys!) To those that are asking these questions, this program is a tool that you have some responsibility in learning how to use. If you just bought a car, would you ask "How come my car isn't going anywhere?"...just sayin'

Now, since I wanted to see what kind of results I could get, I tried to eliminate (or reduce) as many of the variables as I could. I wanted to produce as many links as I could. So here are the settings I used.

- I had five projects running.
- I selected all site types except for Indexer/Pingback/Referrer (for every project).
- I selected "Skip sites with more than 200 outgoing links" (pretty high)
- I ignored PR (would get more links posted)
- I didn't limit the number that could be posted during any time frame
- I allowed posting to the same domain
- I used URLs from Global List
- I selected search engines from USA/Canada/Australia/United Kingdom
- I used the "Bad Word" list provided by GSA
- I created highly spun articles/descriptions/etc to be used during posting
- I submitted backlinks to search engines (check box)
- I submitted backlinks to GSA SEO Indexer (only indexing method used)
- I ran with Captcha Sniper 7.77 as only captcha solving agent
    (*Note - Using DeathByCaptcha or others would have increased results, I am sure.)
- I ran with 20 private (fast) proxies. (I used them "Everywhere".)
- HTML timeout set to 90 seconds.
- Used default (60 seconds) wait between S/E queries.
- # threads ---> see next section
- Internet speed (tested) - 12-17 Mbps (download), 3-5 Mbps (upload)

The biggest variable in my testing was the PC that it was running on. I ran for the first few days on my laptop (Dual-core AMD processor, 3GB RAM, Windows 7). I ran for the next few days on my desktop (Quad-core AMD, 8GB RAM, Vista 64-bit).

While running on the laptop, I ran 25 threads. (If I ran more than this, it started to affect the PC's performance for my other tasks.) While running on the desktop, I started at 50 threads and changed to 70 and then 90 (then back down). (BTW - I was never able to bog this PC down no matter how many threads I was running.) While there was some increase each time I raised the number of threads, it definitely was not linear. As others have noted/questioned in the forum, there are times when the program does not use the max number of threads. Like many others, I would like a better explanation of how the threads work within the program. (I posted this question/suggestion in another discussion - )
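The sub-linear scaling described above is typical whenever some part of the workload is effectively serialized (captcha solving, search-engine wait timers, disk writes). As a purely illustrative sketch - the parallel fraction used here is a made-up number, not a measurement from GSA SER - Amdahl's law shows why doubling threads does not double output:

```python
# Amdahl's law: overall speedup with n threads when only a
# fraction p of the work can run in parallel.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Hypothetical 90% parallel fraction: even at 90 threads the
# speedup tops out well under 10x, and each jump in thread
# count buys less than the previous one.
for n in (25, 50, 70, 90):
    print(f"{n} threads -> {speedup(0.9, n):.2f}x")
```

The exact numbers are not meaningful for GSA SER; the shape of the curve is the point - each extra batch of threads returns less than the last.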

As far as results, I can say beyond any doubt, the ABSOLUTE BEST THING you can do to extract the most out of this program is to run it on a PC that can handle it. I got two to three times the results with the desktop setup as I did with the laptop.

For those that are using (or contemplating using) a VPS, I would expect results to be even higher, mainly because of the high-speed internet connection.

In raw numbers, I was getting 300-400 verified links/day while running on my laptop. I am now getting 800-1200 per day on the desktop.
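Taking the midpoints of the ranges above, the desktop's advantage works out to roughly 2.9x, consistent with the "two to three times" estimate. A quick back-of-envelope check:

```python
# Midpoints of the reported verified-link rates.
laptop_per_day = (300 + 400) / 2     # 300-400/day on the laptop
desktop_per_day = (800 + 1200) / 2   # 800-1200/day on the desktop

ratio = desktop_per_day / laptop_per_day
print(f"desktop/laptop throughput: {ratio:.1f}x")  # roughly 2.9x
```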

One other thing that I did notice that is DEFINITELY WORTH MENTIONING...there were hours at a time when results would drop below these numbers (sometimes significantly). I am fairly certain that this was during RANDOM TIMES when the program just happened to be processing some search queries that were not producing as many results. This is going to happen and should be expected. It is not a problem or bug with the program (or any of the setup options for that matter). Searches for some keywords/platforms are not going to produce as many results as others.

Also, if you try to "transpose" any of this information (and results) to your projects, remember that your options will be different than mine (and anyone else's), so your results will be different. And remember that I was trying to get as many links as possible. With "normal" settings, I would probably get lower results.

I tried to give as much detail as possible to help some of the "newbies" and those more "technically challenged" understand this whole process better (not that I fully understand it yet). I hope it does that.


  • s4nt0s Houston, Texas
    edited September 2012
    Wow. What a fantastic post! That's awesome you took the time to do that and write everything out. I'm glad some people understand how many variables play a role in the amount of links and success rate, etc., and that's why it's so hard to answer those kinds of questions.

    You just have to educate yourself on how the software works and then it's a lot easier to unlock its potential.

    Every newbie should take the time to read this post. I wish everyone did thorough testing like you. Great share!

    Oh yeah, and I'm sure you'd see an increase in success rate if you did have a backup captcha service. Glad you're doing pretty good with it so far.
  • Welcome to the community DavidA2. I'm in a little bit of a hurry, but one big improvement is the search time between queries. I think with 20 private proxies you could go down to 20 seconds without any risk.
  • Thanks...I'll give that a whirl for sure.
  • AlexR Cape Town
    I am using 20 private proxies and running at a 4-second SE time interval. Also try reducing the maximum website size to download to 2 MB. This will help as well.