Slow and steady, the turtle wins the race... or is it the quick hare that gets the gold?!
OK. I see that if I slow down, I won't overflow my allotted CaptchaTronix buffer.
So...that saves me money on solves.
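For anyone who wants to ballpark that, here's a rough sketch of the math in Python. The allowance and per-thread captcha rate are made-up placeholders (I don't know anyone else's plan limits, and this isn't anything from GSA or CaptchaTronix), but the idea is: cap your threads so captcha submissions stay under what your plan can actually solve.

```python
# Back-of-the-envelope: how many threads can I run before captcha
# submissions outpace my solver allowance?
# All numbers below are hypothetical placeholders, not real plan limits.

SOLVES_PER_MINUTE_ALLOWED = 60       # assumed plan allowance
CAPTCHAS_PER_THREAD_PER_MIN = 0.5    # assumed captchas one SER thread generates

def max_safe_threads(allowance: float, per_thread_rate: float,
                     safety_margin: float = 0.85) -> int:
    """Threads you can run while staying under the solve allowance,
    with a safety margin so bursts don't overflow the buffer."""
    return int((allowance * safety_margin) / per_thread_rate)

print(max_safe_threads(SOLVES_PER_MINUTE_ALLOWED, CAPTCHAS_PER_THREAD_PER_MIN))
# -> 102 threads, under these made-up assumptions
```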
There seem to be distinct advantages to running either slowly or lightning-fast.
I guess if you're at a point where all your projects' and sub-projects' link demands (HUUUUNGRY!!! lol) mean you have to hit those super-high VPM numbers, I get it. It's essential not to waste a minute of that 24-hour period, or any processing resources.
But I'm not at that point just yet. Probably not for a while.
So, leaving GSA on with far fewer threads also spreads out the links time-wise. For sites crawled by the particular SE you're trying to help recognize the worth of your brand, it then (potentially) sees these new links throughout the day.
Just a thought. For now, I think slowing down makes sense in my case. Until it no longer does? lol I think that's the point you eventually want to reach.
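To make the "spread it out" idea concrete, here's a tiny drip-feed sketch: if you want a day's links paced evenly across 24 hours instead of dumped in one burst, the interval falls out directly. Purely illustrative numbers; GSA's own thread and pause settings do the real work.

```python
# Pacing math: links-per-day spread evenly over 24 hours.
# The daily target is a made-up example.

LINKS_PER_DAY = 480  # hypothetical daily target

links_per_hour = LINKS_PER_DAY / 24
minutes_between_links = (24 * 60) / LINKS_PER_DAY

print(f"{links_per_hour:.0f} links/hour, one roughly every "
      f"{minutes_between_links:.1f} minutes")
# -> 20 links/hour, one roughly every 3.0 minutes
```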
Therefore, I'm also checking out the Scheduler, which I've never used. (I mostly rely on Select by Mask, done manually)
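Since I haven't used the Scheduler yet, here's just my mental model of the concept as a toy sketch, not SER's actual implementation: rotate batches of projects through time slices so everything gets a turn without all projects fighting for threads at once. The project names, batch size, and time slice are all invented.

```python
# Toy mock-up of a project scheduler's concept: run projects in small
# batches, each batch getting a fixed time slice, then rotate.
# Everything here is hypothetical, not SER's real behavior.

from itertools import islice

projects = [f"project_{i}" for i in range(1, 9)]  # hypothetical project list
BATCH_SIZE = 2           # projects active at once (assumed)
MINUTES_PER_BATCH = 30   # time slice per batch (assumed)

def batches(seq, size):
    """Yield consecutive chunks of `size` items from `seq`."""
    it = iter(seq)
    while chunk := list(islice(it, size)):
        yield chunk

for slot, batch in enumerate(batches(projects, BATCH_SIZE)):
    start = slot * MINUTES_PER_BATCH
    print(f"t+{start:3d}min: run {batch} for {MINUTES_PER_BATCH} minutes")
```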
And...
I guess at some point the answer is to order a higher-thread package and slowly work up to it to prevent overflow, keeping only so much headroom that you're very nearly maxed out at peak operation, maybe with ~15%(?) to spare. Then keep repeating the process as your needs outgrow each higher rung of thread packages? Kind of like re-potting a houseplant in progressively larger pots as it grows.
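Putting that re-potting rule in numbers, as a sketch: upgrade to the next tier once peak usage eats into the ~15% headroom. The package sizes and usage figures are placeholders, not anyone's real pricing tiers.

```python
# The "re-potting" rule in numbers: move up a thread package once peak
# usage leaves less than ~15% headroom. Tiers and figures are made up.

PACKAGES = [50, 100, 200, 400]  # hypothetical thread tiers
HEADROOM = 0.15                 # spare capacity to keep at peak

def next_package(peak_threads: int) -> int:
    """Smallest package whose capacity still leaves the headroom at peak."""
    for cap in PACKAGES:
        if peak_threads <= cap * (1 - HEADROOM):
            return cap
    return PACKAGES[-1]  # already at the top tier

print(next_package(80))   # -> 100 (80 fits under 85)
print(next_package(90))   # -> 200 (90 exceeds 85, so move up)
```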
Comments
Set up two burner projects pointed at some made-up URL, set one to use only GSA CB and the other to use only CaptchaTronix, and see if there's any difference in URL output between the two.
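If you want to tally the results of that test, a trivial sketch: export the verified URL lists from the two burner projects and compare counts and overlap. The file names and one-URL-per-line format here are hypothetical assumptions about how you'd export them, not a documented SER format.

```python
# Trivial tally for the burner-project test: compare verified-URL exports
# from the GSA CB project vs. the CaptchaTronix project.
# File names are hypothetical; assumes one URL per line in each export.

def load_urls(path: str) -> set[str]:
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}

cb_urls = load_urls("verified_gsa_cb.txt")          # hypothetical export
ct_urls = load_urls("verified_captchatronix.txt")   # hypothetical export

print(f"GSA CB:        {len(cb_urls)} verified URLs")
print(f"CaptchaTronix: {len(ct_urls)} verified URLs")
print(f"Overlap:       {len(cb_urls & ct_urls)}")
```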
It seemed OK for mid- to low-quality links, non-contextuals of all types.
The new GSA-SER CAPTCHA tracking feature shows it's solving a fair number of (I'll admit) rather junky link types, maybe? lol But I still have a need for these.
I don't yet own a GSA-CB license, Shaun. Once I do, I'll try this. Thank you kindly, once again, for the heads-up on this.
I have a project where it took me one day to build links: very targeted, manual link building. I scraped the targets with Scrapebox. I still get traffic and sales from those links, built six years ago, while all of my "link building" campaigns are mostly dead: domains penalized, spam disappeared. The old domain still has authority and keeps ranking.
Do you know the 80/20 rule? I believe 20% is picking the right product and making a website that's good for users, and the other 80% is driving traffic. So think about that next time you're convinced your only option is more links and more tiers.