Why is GSA slow?
Hi
I am new to GSA; I bought it recently and have been running it for over a week now on 2 projects with min PR1, and it has only built 800 links.
I have proxy harvesting set for everything.
I just want to ask why it's so slow. Is that normal? I am running it off a VPS with a 1 Gbps port, and I don't have "dofollow only" links checked, just to make it faster, but it's still slow.
Can someone help or advise? Also, would it hurt my site to have a lot of nofollow links?
Thanks
Comments
How many keywords are you using? It appears the more the better - I'm using quite a few with good results.
How many search engines are you using? Just Google? I'm using virtually all of them but plan on reducing that to mostly English-based regions. However, using all of the SEs has been quick. Not Xrumer fast, but much better than just using Google.
How many platforms are you using? Are you using an appropriate number of threads? What are your expectations?
I use private proxies and can build over 800 links within a few hours. Then again, it also depends on your settings. We would need more information.
Are you skipping captchas? Using Captcha Sniper only? OBL? Bad word filters? What other project options are selected? Thread count? etc. etc.
You didn't give us very much to work with.
Thanks for the reply
I am not using private proxies; I am just letting GSA harvest proxies and then using them for everything.
I am using it on a VPS with good resources and a super fast connection, and I'm not using even 50% of the resources.
I am using all English search engines as well as over 50 keywords, but it's still super slow.
Any suggestions?
I am using over 150 threads.
Anything I should be checking?
I've never tried it with public proxies so I'm not sure how that will work.
Public proxies are going to be your problem though.
I've been playing around with it and it's a breeze at 300, 500, and even 1000 threads, though I've not tested it for a long time... only around an hour or two, but it ran fast without a problem.
In that time it dropped around 3K links and verified 800 of them.
PS - I don't recommend setting the thread count that high. I was only testing how fast GSA would go.
@micb11 - This is just my humble opinion, but a lot of people seem to get hung up on setting a minimum PR level. This greatly reduces the number of targets, and the number of links built.
Again, I have been doing this a very long time and I do understand the rationale for high PR properties. All I am saying is that I do not use that setting, and I am crushing it on ranking some pretty competitive terms.
@s4nt0s - I use my own bad word list, which I probably got from you, hah. But yes, I always use the bad words filter.
I also quit using the OBL filter because *most* of the links had few OBLs anyway.
I just think these two filters (PR and OBL) can strangle linkbuilding. Again, they were designed with SEO best practices in mind, but Google has gotten easier to rank since April 25th IMHO.
I'm surprised people are saying they get poor submissions.
This is a screenshot of one of my better days recently. Easily hit with the right settings.
People I have coached get 40k submissions a day.
That was until versions 4.76 and 4.77 were released.
@LeeG - That's pretty impressive. I am using 30 private proxies but only 100 threads, and getting about 25,000 submitted - without feeding SER any lists. So it's proportional, but you are definitely pushing more threads on those 38 proxies.
Are you feeding it lists, or is SER doing this volume finding its own targets?
There are tweaks that can be done. There has been a lot of testing along the way.
Working out what takes time and can be dropped, killing off the myths that keep getting repeated as gospel.
The only captcha service used to get those results is Captcha Sniper.
Basics behind how those numbers are hit:
Using an external captcha service adds time to posting. With Captcha Sniper, you're below two seconds per captcha. External services take time; over fifteen seconds has been seen when a service is busy.
Add extra captcha types and you're hitting more sites. Some take seconds to add; others take a bit of messing around.
Captcha Sniper retries: set to zero. Don't bother with multiple retries. I jumped from a 40k average to 60k just with that one setting.
Time between search engine queries: add enough engines (but not all the Google clones) and GSA lets you search without proxies and reduce the time between searches. That removes one step where proxies are used; a direct connection to a search engine is quicker than going through a proxy. And I have never been blocked by an engine.
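The arithmetic behind those captcha timings is worth spelling out. Here's a rough sketch in Python; the ~2s and ~15s solve times come from the post above, while the 150-thread count is just an assumption for illustration:

```python
# Back-of-envelope upper bound on captcha-gated submissions per day.
# Assumes each thread is blocked for solve_seconds per submission and
# does nothing else; real throughput is far lower, but the ratio between
# a local solver and a slow external service is the point.

def max_submissions_per_day(solve_seconds, threads):
    seconds_per_day = 24 * 60 * 60
    return threads * seconds_per_day // solve_seconds

local = max_submissions_per_day(2, 150)    # local Captcha Sniper, ~2s/captcha
remote = max_submissions_per_day(15, 150)  # busy external service, ~15s/captcha

print(local, remote)  # the local solver's ceiling is 7.5x higher
```

Nobody hits those ceilings in practice, but the 7.5x gap explains why shaving seconds off each captcha moves the daily numbers so much.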
I add lists, but only things like wikis, Pliggs, and article directories. I never feed it blog lists. Any blog lists shared or sold on the net, IMO, soon become spam havens.
The only doubt I have now is the recent change to the scheduler and the random posting. Prior to that change, you could evenly spread your daily links. Now that's been taken away.
Another tweak I forgot about is the HTML timeout.
Mine is set to 130; it will go lower when I tweak it again.
Set it to max, then work down while it's running the link checking.
You will find a sweet spot where you get a lot of verified links and few page timeouts.
Set it too high and a lot of time is wasted waiting to be told a link doesn't exist or a site is down.
The speed of your proxies can affect that setting.
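That "start at max, then work down" routine amounts to picking the lowest timeout that doesn't cost you verified links. A minimal sketch of the idea, assuming you've jotted down (verified, timeouts) counts from a few trial runs; the numbers below are made up:

```python
# Pick the lowest HTML timeout whose verified count stays within 5% of
# the best trial. Trial data shape: {timeout_seconds: (verified, timeouts)}

def pick_timeout(trials):
    best = max(verified for verified, _ in trials.values())
    good = [t for t, (verified, _) in trials.items() if verified >= 0.95 * best]
    return min(good)

# Made-up trial runs: dropping from 130 to 110 costs a lot of verified links.
trials = {180: (800, 10), 150: (795, 20), 130: (790, 40), 110: (600, 300)}
print(pick_timeout(trials))  # 130 is the sweet spot here
```

The 5% tolerance is arbitrary; in practice you'd eyeball the verified-vs-timeout numbers in SER's log rather than compute anything.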
Specs on my VPS: Windows Server 2008 R2, 6 GB of RAM, a 60 GB hard drive, and a 100 Mbit connection.
With GSA, CSX, antivirus, and the operating system, I'm just over 20 GB used on the hard drive.
GSA doesn't go much above 10 Mbit under normal running; it normally averages about 6 Mbit.
If you shop around, you can get some good deals on a vps.
@LeeG - I quit using DBC as a backup months and months ago because it was racking up too much cost.
That is excellent advice about the captcha retries set at zero. After all, we are not looking for perfection in a task - we are looking for cranking out the highest volume given our parameters.
Challenging the status quo - that is exactly what I was saying above. I don't get caught up in things like PR and OBL's because they really put a drag on the linkbuilding.
Here are my settings:
I think we are roughly on the same page, but that is great advice on getting rid of the CSX retries - in fact, I just set it to zero. I'm also going to bump up my threads to 150 to see how SER handles it.
Let me know what your wait time is between search engine queries... I'm interested to see where you are at with that.
Also let me know how many search engines you use. I have mine set to all 156 English search engines. I have to admit, I never experimented with that setting.
I'm probably going to a dedi with powerup. I'm at 30 projects right now, and I know that I'm going to dog my home PC if I keep adding projects.
It's nice to know what others are doing, and I'm sure this discussion will help others.
Thanks dude!
I am new to GSA, but I have to say, for Tier 2 and Tier 3 this is quick.
My only problem is generating high-quality, relevant PR2+ links on autopilot, and that's where I'm facing the slowness.
It's been 2 days now and not even 1k; however, this afternoon I started Tier 2 and Tier 3 for another campaign, and it's over 5k now just for Tier 2, and 8k for Tier 3.
I have been reading that some say you can import extra sites into GSA? How is that possible? Sorry, I am still new to GSA.
@ron - You are right about OBL and PR throttling the number of links. I've had only a few links in 24 hours since I went with PR3 (subdomain) on a new site; otherwise I have had over 2K links verified overnight at a moderate thread count of 20-30.
Personally, I don't use proxies for scraping (I select the 37 search engines shared by Ozz in another thread). I do use proxies to post (20 dedicated from Buyproxies.org), and I'm still experimenting with the thread count. I've tried 100, 300, 500, and even 1000, but that's mostly for tiers 1 to 3. For the money site I prefer to take things slow and use only 20-30 threads to spread out the link building, limiting it to only around 30-50 links per day (using PR and OBL filters; truth be told, GSA does struggle to get those 30-50 links when using only a couple of platforms). I guess I'm going to take a page out of ron's book and disable these filters and see how it goes.
@zalouma - Use ScrapeBox to scrape a ton of links (I started with around 300,000), then right-click on the project and import target URLs from file/clipboard. If you are asking about adding new sites and platforms... sorry, I cannot help. I'm not a geek when it comes to those things (my ZennoPoster is still lying idle).
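One practical note on importing scraped lists: it helps to dedupe and strip junk lines before feeding a big ScrapeBox dump into SER. A minimal sketch; the sample list is hypothetical, and in practice you'd read the URLs from the exported text file:

```python
# Dedupe a scraped URL list and drop lines that aren't URLs at all,
# preserving the original order of first occurrence.

def clean_url_list(lines):
    seen, out = set(), []
    for line in lines:
        url = line.strip()
        if url.startswith("http") and url not in seen:
            seen.add(url)
            out.append(url)
    return out

raw = ["http://a.com/page", "http://a.com/page", "junk", "https://b.org/"]
print(clean_url_list(raw))  # duplicates and the non-URL line removed
```

SER will skip duplicates itself, but a pre-cleaned list of 300k URLs imports and runs noticeably leaner.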
@tumpien - When you mentioned lowering the threads for the moneysite, that really is not necessary. You put a smaller limit on the links for the homepage, and in essence, that is your throttle. The thread count is global across all projects, so you don't need to adjust that on a per project basis. I also use private proxies for everything in SER. Essentially, it just runs. I don't stop it other than to update, and I do make an effort to check all emails for blacklisting no longer than once every 2 days.
Ozz did a fantastic writeup that you should bookmark and probably print out for your future strategy:
https://forum.gsa-online.de/discussion/879/guide-high-quality-campaign#Item_3
Ozz should have charged everyone money for it. It will seriously help your rankings. I have to admit I was already doing something very similar, and I can attest that it works. It really should be stickied, and I wish @Sven would do that.
Please do me a favor and paste the thread where Ozz talks about the 37 search engines. I am interested to see what was discussed there.
@indylinks - I forgot to say that those stats of yours are fantastic. It's exciting to see that many verified (as a % of submitted) in a run, and it's proof of how valuable importing lists is.
https://forum.gsa-online.de/discussion/79/the-real-search-engine-optimisation-thread/p1
I run 13 search engines.
Even that I consider too high at times.
A lot of the engines are Google spin-offs.
It doesn't matter what you call a chicken; it's still a chicken.
If you ever track your results, Google UK, Google US, etc., most of your results are within one or two places of each other. Why pull the same results time and time again and waste time that could be better spent posting links?
@LeeG - But what is your timeout setting between search engine queries? I think that is important.
@tumpien - Thanks for the share on the 37 search engines. I do not know how I missed that...
@Ozz - Great work once again!
10-second timeout between queries.
I'm only running 13 search engines at present.
Most are just the same chicken with a fancy name.
All you end up with are the same results, so why waste time trying to post to the same old same old and getting the "url already parsed" message?
I run a VPS to build links, not to keep adding traffic to blogs.
Bandwidth costs money; why waste it?
Here ya go, a screenshot to save anyone getting confused.