Each page has an article relevant to its topic. I've programmed something to generate about 5,000 articles with unnoticeable Spintax, sorted latest first and formatted for import into GSA. Importing the articles folder into GSA took almost 2 hours though, otherwise I would have done a lot more. With the Spintax, I'm guessing those 5,000 work out to around 100,000 unique articles.
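For anyone curious, a spintax expander is only a few lines. This is just a rough sketch of the idea, not my actual script, and the {option|option} syntax and sample template are only illustrative:

    import random
    import re

    def spin(text):
        # Resolve each {option1|option2|...} group with a random choice.
        # Innermost groups match first, so nested spintax also resolves.
        pattern = re.compile(r"\{([^{}]*)\}")
        while True:
            text, count = pattern.subn(
                lambda m: random.choice(m.group(1).split("|")), text)
            if count == 0:
                return text

    def variation_count(text):
        # Rough estimate of unique outputs for flat (non-nested) spintax:
        # multiply the number of options in every group.
        total = 1
        for group in re.findall(r"\{([^{}]*)\}", text):
            total *= len(group.split("|"))
        return total

    template = "GSA {builds|creates|makes} links on {dozens of|thousands of} platforms."
    print(spin(template))             # one random variant of the article
    print(variation_count(template))  # 3 * 2 = 6 possible variants

Multiplying the option counts of every group is how a few thousand templates turn into six-figure unique-article estimates.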
I've only used GSA indexer, but I see backlinks went from like 30 to 300 today on Ahrefs, so the indexer must be working?
I think you should scrap that site and start over. If you could afford the processing power to run enough copies of SER to rank for over 100,000 keywords quickly, then you wouldn't need to do SEO to make money; your bank's interest would keep you fine :P
Maybe change your strategy slightly: rather than trying to rank 100,000 pages, try to rank one page for one specific keyword. Then you can put 100% of your effort into that one keyword.
It really depends on your personal risk tolerance. I don't believe any SEO effort is worth the risk of causing irreparable damage to a serious money site. I use aged PBNs for tier 1 and GSA for tiers 2 and 3. I'll hit each T1 URL with around 3 T2s a day, and hit each T2 with around 10-15 T3s a day. I'll drip these projects over 3-4 weeks.
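Rough numbers for how that drips out, assuming 10 T1 URLs, 12 T3s per T2, and a 25-day window (example figures only, not hard rules):

    # Back-of-envelope for the drip schedule above. The 10 T1 URLs and
    # the 25-day window are assumptions; 12 is the middle of the 10-15 range.
    t1_urls = 10
    t2_per_t1_per_day = 3
    t3_per_t2_per_day = 12
    drip_days = 25

    t2_total = 0
    t3_total = 0
    for day in range(drip_days):
        t2_total += t1_urls * t2_per_t1_per_day   # new T2s built today
        t3_total += t2_total * t3_per_t2_per_day  # every existing T2 gets hit

    print(t2_total)  # 750 T2 links
    print(t3_total)  # 117,000 T3 links over the drip

The point is that tier-3 volume compounds fast even at a gentle per-day pace.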
DIAMONDHOTRIMZ1 My list is highly refined, made up of my own scraping plus two premium lists that I get for free.
ronniej556 Think of all the premium list sellers, and think of how many servers they have scraping 24/7. There is always going to be a lot of overlap in the lists. As I said above, I have free subscriptions to two different list services; a while back I compared the two and found around 60% duplication between them.
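If you want to check the overlap yourself, dumping both lists into sets is enough; the filenames here are made up:

    # Quick overlap check between two site lists (the filenames are made up).
    def load_urls(path):
        with open(path, encoding="utf-8", errors="ignore") as f:
            return {line.strip().lower() for line in f if line.strip()}

    list_a = load_urls("premium_list_a.txt")
    list_b = load_urls("premium_list_b.txt")

    shared = list_a & list_b
    overlap = len(shared) / min(len(list_a), len(list_b)) * 100
    print(f"{len(shared)} shared targets ({overlap:.0f}% overlap)")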
Also, someone using the link-extraction method to get target URLs will be able to find a fair amount of your list from a single hit by working backwards.
I have tried a fair few list providers now and they are all packed out with crap platforms... that's why I run them through my own process every few weeks: keep the good ones, then just scrape what I want and process my own stuff.
Self-scraping is really the only route to go now. I think SERocket adds like 3 URLs to their lists per day; I always get "already parsed" when I import their lists each day.
The older the domain, the more resilience you'll have to play with.
Another factor is how you are indexing these links; if you're not actively indexing them, then building those numbers in that time frame should be fine.
Keyword variations: how many keyword anchors are you using? Sometimes people think it is 100% to do with link velocity, but most of the time it is actually your anchor ratios being too high with that kind of output.
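A quick way to check is to dump your project's anchors and look at the percentages; the anchor list below is just example data:

    from collections import Counter

    # Example anchor data only; in practice export the anchors from your
    # project's verified links and drop them in here.
    anchors = (["buy blue widgets"] * 60 + ["example.com"] * 20
               + ["click here"] * 10 + ["blue widgets"] * 10)

    counts = Counter(anchors)
    total = len(anchors)
    for anchor, n in counts.most_common():
        print(f"{anchor:20s} {n / total:6.1%}")
    # One exact-match keyword sitting at 60% of anchors is the problem
    # long before link velocity is.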
Comments
SEO is a game of patience.
Probably SERocket and SERlists
Both of which are like 70% Joomla k2
Are you tiering your links?