Scalability of tiered link building
I'm calling on @LeeG, @ron, and others who have experience using GSA SER for large-scale tiered link building and/or maximizing LPM. But everyone, feel free to chime in.
Now, I've been using a similar setup, inspired by @Ozz here:
I still struggle to get over 35 LPM consistently, but that is not the issue here; I can read other threads for that, and there is ample information here already.
Every day, looking at my stats, I feel my setup fails to build enough links to satisfy each tier, even though I only use three tiers instead of the four that Ozz and ron use.
Then today I read a thread started by ron (you can find it here: https://forum.gsa-online.de/discussion/2930/ser-tiers), and it occurred to me that perhaps I should understand the figures better. So I fired up Excel and put together a quick formula as follows. I've never been good at math, so correct me if I'm wrong.
The screenshot above is based on the following assumptions:
- Number of T1 links per day: 10
- Verification percentage: 10 percent (conservatively) for each tier
- Number of T2 links per day: x20 per URL
- Number of T3 links per day: x20 per URL
- Number of T4 links per day: x20 per URL
- Time before verification: 1 day
- No T1A, T2A, etc.; just contextual T1, T2, T3 and a kitchen-sink T4 (T3A).
Note: In ron's strategy, T3A is assumed to be T4, because it links to T3.
Look at the figure (highlighted in blue).
At day 30, while still submitting 10 links per day to the money site, the total number of verified T1 links has grown to 29.
Daily submissions required for T2 (day 30): 560. Total verified T2 links: 756.
Daily submissions required for T3 (day 30): 14,040. Total verified T3 links: 11,700.
Daily submissions required for T4 (day 30): 208,000.
Remember, this is for only one money site URL (T0). At the end of the second month, at this rate, SER would need to be submitting millions of links every day just to satisfy the 20x-per-URL option on the tier 3 links alone.
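For anyone who wants to sanity-check the arithmetic without opening Excel, here is a minimal Python sketch of the same growth model, using exactly the assumptions listed above (the variable names are my own). It reproduces the day-30 figures quoted here.

```python
# Minimal sketch of the tier growth model above, under my assumptions:
# 10 T1 submissions/day, 10% verification per tier, a 1-day verification
# delay, and the "x20 per URL" option for every lower tier.

DAYS = 30
T1_PER_DAY = 10
VERIFY_RATE = 0.10
PER_URL = 20
TIERS = 4              # T1 contextual, T2, T3, and the kitchen-sink T4

cum_submitted = [0.0] * TIERS   # cumulative submissions per tier
verified = [0.0] * TIERS        # verified totals as of the previous day

for day in range(1, DAYS + 1):
    # Today's submissions: fixed for T1, "x20 per verified upper-tier URL"
    # for each lower tier, based on yesterday's verified totals.
    submit_today = [float(T1_PER_DAY)]
    for tier in range(1, TIERS):
        submit_today.append(PER_URL * verified[tier - 1])

    # Verification lags one day: today's verified totals are 10% of
    # everything submitted up to and including yesterday.
    verified = [VERIFY_RATE * s for s in cum_submitted]

    for tier in range(TIERS):
        cum_submitted[tier] += submit_today[tier]

for tier in range(TIERS):
    print(f"Day {DAYS} T{tier + 1}: submit today {submit_today[tier]:>9,.0f}, "
          f"total verified {verified[tier]:>9,.0f}")
```

Extend DAYS to 60 and the T4 line alone comes out at roughly 2.2 million submissions for that day, which is exactly the "millions per day by month two" problem.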
This is a problem everyone adopting tiered link building will face sooner rather than later. Even with a setup as efficient as LeeG's, a single instance of SER can barely keep up if I want to hit the target number of links for each tier, every day.
In other words, even when everyone is using the exact same setup and options, a small decision such as how many projects I run per instance of SER will determine whether my T2 and T3 links get enough juice that day, or none at all. And that, in turn, will impact indexation and ranking.
So please help me. How do you cope with this issue:
1) Just let SER do its best. In this case, I have a few questions: how does "Use verified URL of..." work? Does it pick URLs in sequence or at random after each submission, or does it move to the next URL only after it has built 20 links to the current URL? If the latter, how do you make sure the latest links get their share of lower-tier links? Maybe @Sven can chime in and clarify how this feature behaves?
2) Use an indexer service / software to help with indexing. (Assuming that, besides passing more power to upper-tier links, the purpose of lower-tier links is also to help get them indexed.)
3) Re-verify every so often to minimize waste. Build links only to verified links that are still there. (I know at least ron does this every week.)
4) Delete verified URLs every so often so SER has more time to build links to new URLs. If so, after how many days? How many links per upper-tier link before you stop and move on?
As a side note: Perhaps SER should have an option to make deletion of URLs older than xx days / weeks easier?
I can imagine this becoming an important step. Perhaps I should move on after each T2 and T3 link has received 30 verified links (30 days), then delete them to give newer links a chance; a rough sketch of this pruning idea follows after the list. Is anyone doing this?
5) Scale to another VPS / dedicated server. If this is part of your strategy, how many projects and URLs do you run per instance of SER?
6) I think ron mentioned in another thread keeping a spreadsheet of verified links from day to day. That is a great idea for tracking progress: if too many projects are not growing their verified links per day, then perhaps it's time to optimize first, then scale vertically / horizontally. A sketch of this kind of tracking also follows after the list. (The request for "submitted for the day" and "verified for the day" columns beside each project is, I think, a great idea: https://forum.gsa-online.de/discussion/1147/new-feature-submitted-today-verified-today-by-project. It would help in this case too, among others, as ron pointed out in that thread.)
7) Other advice or ideas?
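On point 4, here is a rough sketch of the pruning I have in mind: keep only verified URLs younger than some cutoff so the lower tiers focus on fresher targets. It assumes an exported CSV with url and verified_date columns, which is my own format, not anything SER produces by default; adjust the parsing to whatever your export actually looks like.

```python
# Hypothetical pruning helper for point 4 (input format assumed, see above).
import csv
from datetime import datetime, timedelta

MAX_AGE_DAYS = 30   # e.g. retire T2/T3 URLs after ~30 days of lower-tier links
cutoff = datetime.now() - timedelta(days=MAX_AGE_DAYS)

with open("verified_export.csv", newline="", encoding="utf-8") as src, \
     open("verified_trimmed.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.DictReader(src)              # columns assumed: url,verified_date
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    kept = dropped = 0
    for row in reader:
        verified_on = datetime.strptime(row["verified_date"], "%Y-%m-%d")
        if verified_on >= cutoff:             # still fresh, keep building to it
            writer.writerow(row)
            kept += 1
        else:                                 # old enough, let it go
            dropped += 1

print(f"Kept {kept} URLs, dropped {dropped} older than {MAX_AGE_DAYS} days.")
```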
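And on point 6, a sketch of the day-to-day tracking idea: log one row per project per day (date, project, verified total) into a CSV you maintain yourself, then flag projects whose verified count has barely moved over the last week. The log format and the growth threshold are my own assumptions.

```python
# Hypothetical progress tracker for point 6 (log format assumed, see above).
import csv
from collections import defaultdict

MIN_WEEKLY_GROWTH = 50          # my own threshold; tune it per project

history = defaultdict(list)     # project -> [(date, verified_total), ...]
with open("verified_log.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):             # columns assumed: date,project,verified
        history[row["project"]].append((row["date"], int(row["verified"])))

for project, rows in history.items():
    rows.sort()                               # ISO dates sort correctly as text
    recent = rows[-7:]                        # roughly the last week of entries
    growth = recent[-1][1] - recent[0][1]
    status = "OK" if growth >= MIN_WEEKLY_GROWTH else "STALLED - optimize or cut"
    print(f"{project:<30} +{growth:>6} verified this week  {status}")
```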
I know, the number is not etched in stone. Tiered link building is not perfect. Links are deleted, not every link should have specified amount of inbound links to avoid footprints, etc... but I'm not asking for exact figure either but some guidelines, hints, tips, anything to make it more cost effective and profitable for all of us.
Let's make this thread an interesting discussion for "scaling GSA SER". (I apologize for the long post.)
Comments
Wayyyyy too long of a post for starters.
I'll just cut to the chase. The number of links you build, and the number that are verified, have everything to do with the platforms and engines you select.
A long time ago, I had engines submitting tens of thousands of links with only 100 verified. I probably use only 1/3 of all the engines to make links. The other engines are way too inefficient probably because they are moderated or whatever.
You must compare the data on verified links vs. submitted links. That is where your problems lie.
You are getting into all this other stuff which really is not the issue.
Go to Options>Advanced>Tools>Show Stats>Submitted/Verified
Just export the data to a spreadsheet. Get both on the same page. Play with the spacing so that the same engines are on the same rows. Then on each row, divide verified by submitted.
You now have %'s of verified for each engine. You also have raw data on the absolute number of verified for each engine. I look at both of these numbers before deciding what are the worthless engines.
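If you want to skip the manual row alignment, the same division is easy to script. The sketch below assumes the two stats lists have been saved as simple engine,count CSVs (an assumed format; adjust the parsing to whatever the export actually gives you), then joins them by engine and prints verified/submitted for each.

```python
# Sketch of the engine check described above: join submitted and verified
# counts per engine and compute the verification percentage for each.
# Input files are assumed to be plain "engine,count" CSVs.
import csv

def load_counts(path):
    counts = {}
    with open(path, newline="", encoding="utf-8") as f:
        for engine, count in csv.reader(f):
            counts[engine.strip()] = int(count)
    return counts

submitted = load_counts("submitted_stats.csv")
verified = load_counts("verified_stats.csv")

rows = []
for engine, sub in submitted.items():
    ver = verified.get(engine, 0)
    pct = (ver / sub * 100) if sub else 0.0
    rows.append((pct, ver, sub, engine))

# Worst performers first; look at both the % and the raw verified count
# before deciding which engines are worthless.
for pct, ver, sub, engine in sorted(rows):
    print(f"{engine:<35} submitted {sub:>8}  verified {ver:>7}  ({pct:5.1f}%)")
```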
I have made the mistake of cutting engines from projects, only to find they magically got improved by @sven... So always run the 'bad engines' in some throwaway projects just to gather data; I have had to reverse myself a few times with cut engines.
You really don't solve that issue. No one can. All you can do is keep trimming the dead links and make the top of the tree as perfect as you can, so you don't waste link building underneath or SER resources. The biggest part of not wasting SER resources is what I said: make sure you use the most productive engines.
What happens with all those tiers underneath? The actual plan doesn't require 20X links literally. It sure would be nice, but no software can do that for all your projects. SER handles it to the best of its ability.
Everything you wrote is correct on all of your points. Setting tiers up properly is the majority of the battle. If you are using scheduler, you can handle an unlimited number of projects. The problem is that each project only gets a certain amount of sunshine. You have to be able to make the decision on when you need to start expanding to another VPS based on how many links you think each project needs.
There are no rules of thumb because each website demands a different amount of links to rank and stay ranked.
Expansion to another VPS and additional copies of SER and CB is truly a minor consideration. If you are making money, and it's time to expand, you have about $200 of one-time GSA software cost and $50 per month for a VPS. That's a very small cost for expansion.
But as you plod along, you also need to cut out your crap. Quit hanging on to loser websites. Quit building links and wasting SER resources on websites that aren't worth a damn. Learn to quit wasting time breathing life into the dead, and start creating new websites. Only a few seeds that you plant will really take root. That's the way this business works.
And after all this trimming and adding, you might find yourself only needing the same VPS and the same copies of SER and CB. But slowly you evolve to making 5X the money you were making just one year ago with the exact same set of tools, and the exact same VPS. Then you expand for real, not because your ego demands it, but because you truly have reached your limit with one SER, one CB and one VPS.
If you read Ron's stuff, he basically gives you the blueprint for making money, minus the keyword research. Once you've got that down, plus what Ron has said, you are pretty much set.
That's why I think this forum is better than a lot of the others I've read. If I had read the kind of information that ron and others have given here 3-4 years ago, instead of piecing everything together myself, it would be a totally different game for me right now.
@alex It all depends on the number of competing pages and the niche I am going for. I'm always changing them, or sometimes I don't even use secondary anchor text. It literally depends on the other sites on the first page and the competing pages. In competitive niches I obviously have to be a bit more aggressive; in not-so-competitive niches I can kinda ease up on it.