Unfortunately SEO is not just about building tiered links and expecting to rank. You've got link loss, indexing, the authority of your link sources, and anchor ratios, as well as changing on-page SEO factors, all of which will affect the results you see. The niche and the competitiveness of your keywords will of course play a major role too.
On top of that, Google is changing its algorithm more often than ever and hand-picking who it wants to boost in the rankings, such as recent updates giving sites like Reddit and Quora an extra push.
In some niches, all I'm seeing on page 1 are social media sites like Twitter, Facebook, Instagram, and TikTok. It's a good time to be using these as parasites to rank for your keywords. It will be easier and quicker than trying to rank your own money site.
Keyword Competitiveness
Long tails will be the easiest to rank, and that's where I'd recommend anyone focus their SEO. You'll see much higher conversion rates and quicker rankings versus high-competition one- or two-word phrases. With Google focusing on "relevancy", I've seen better results when a page is optimised for the exact target keyword (or a group of related keywords): the exact match keyword in the URL, meta title, and meta description.
Historically, I would have had one landing page covering a range of related keywords on one topic, and building tiers to that page would result in hundreds of related keywords ranking. Now, though, a page with the exact match keyword in the URL, meta title, and meta description will rank higher than a page with longer content that lacks the exact match keyword in those places, simply because it's deemed more relevant to the user's search query.
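As a rough sketch, the on-page check described above is easy to automate. This is a minimal illustration, not a real crawler: the function, the example URL, and the matching rules (case-insensitive, hyphens in the slug treated as spaces) are all my own assumptions.

```python
import re

def keyword_in_page(keyword: str, url: str, title: str, meta_description: str) -> dict:
    """Check whether the exact target keyword appears in the URL slug,
    meta title, and meta description (case-insensitive). Hyphens,
    underscores, and slashes in the URL are treated as spaces, so
    'best-blue-widgets' matches 'best blue widgets'."""
    kw = keyword.lower().strip()
    slug = re.sub(r"[-_/]+", " ", url.lower())
    return {
        "url": kw in slug,
        "title": kw in title.lower(),
        "meta_description": kw in meta_description.lower(),
    }

# Example: a page fully optimised for "best blue widgets"
result = keyword_in_page(
    "best blue widgets",
    "https://example.com/best-blue-widgets/",
    "Best Blue Widgets - Buyer's Guide",
    "Compare the best blue widgets and find the right one for you.",
)
print(result)  # all three checks come back True
```

Run it over a batch of landing pages and you can quickly see which ones are missing the exact match keyword in any of the three places.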
Domain Authority/Page Authority
This is a really important metric to focus on: it represents how easily your site can rank for your keywords. The problem is that no one has any idea what Google perceives your DA to be. Since Google abandoned public PageRank, this metric is only provided by third-party tools. In reality, all SEOs are working blind on this metric.
Relying on metrics provided by services such as Moz DA, Ahrefs DR, or even Majestic TF/CF is just no good. They don't match what Google sees, and boosting your DA/DR/TF does not necessarily increase your rankings:
- These tools use caching, so the data they present is always out of date.
- They don't crawl the entire web; they crawl only a fraction of what Google crawls.
- Their data has no direct correlation to the links in Google's index: they show you both indexed and non-indexed links, not just the links Google has indexed.
- With many sites running bot-blocking plugins, these tools will never have the same database of links that Google has.

All of this means their metrics are based on an incomplete data set, making them pretty useless for business decisions. Still, many SEOs are hung up on the data from these services, despite it being crap data.
As SEOs, we do know that links from unique domains will boost DA/PA, and that the amount of link juice a page passes is affected by the number of external links it has: more external links means less link juice passed per link. So heavily spammed sites that sit inside automated software pass less and less link juice the more they are spammed. This is exactly what's happening with tools like GSA SER and RankerX. An unspammed site, or even a PBN, could in theory pass more link juice if it has fewer external links than these spammed platforms.
Indexing
Whilst it may be obvious that indexing matters for rankings, it's important to understand that as links get indexed and deindexed, the DA/PA value Google holds for you fluctuates up and down. Unfortunately, no one can see this data; Google no longer shares it.
The DA/PA metrics provided by third-party tools don't change as links are indexed or deindexed. Even links that die can take months to be reflected, as these tools use caching to save on costs.
Link Loss
Dead links will be impacting your Google rankings every time Google refreshes its index, so you need a system in place to monitor the indexing and live status of your links. Whilst GSA SER has a reverify option, it's not 100% accurate (although it's still pretty good) and it can't continuously monitor the link status of a project unless you leave that project running indefinitely.
I use another tool, which I've mentioned before, for managing link loss: https://www.inspyder.com/products/BacklinkMonitor. With it I can stay on top of any link loss and laser-target my tiered link building, which makes building and powering up a three-tier structure much easier. Plus, if you build each tier separately, you can end up with a very strong link profile of only live dofollow links, removing the dead and nofollow links from each tier as you build and link-check them.
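The core check a tool like this performs on each backlink can be sketched in a few lines: fetch the page that's supposed to link to you, then classify the link as live-dofollow, live-nofollow, or dead. The sketch below covers only the classification step on already-fetched HTML (the class and function names are mine, and real monitors also handle redirects, scheduling, and indexing checks).

```python
from html.parser import HTMLParser

class BacklinkCheck(HTMLParser):
    """Scan fetched HTML for an <a> tag pointing at target_url and record
    whether it exists and whether it carries rel="nofollow"."""
    def __init__(self, target_url: str):
        super().__init__()
        self.target = target_url.rstrip("/")
        self.found = False
        self.dofollow = False

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attr_map = dict(attrs)
        href = (attr_map.get("href") or "").rstrip("/")
        if href == self.target:
            self.found = True
            rel = (attr_map.get("rel") or "").lower()
            if "nofollow" not in rel:
                self.dofollow = True

def check_backlink(html: str, target_url: str) -> str:
    """Classify a backlink as 'live-dofollow', 'live-nofollow', or 'dead'."""
    parser = BacklinkCheck(target_url)
    parser.feed(html)
    if not parser.found:
        return "dead"
    return "live-dofollow" if parser.dofollow else "live-nofollow"

page = '<p>Great post! <a href="https://example.com/" rel="nofollow">my site</a></p>'
print(check_backlink(page, "https://example.com/"))  # live-nofollow
```

Run this over each tier's link list after it's built and you can filter down to live dofollow links before powering anything up.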
To give you an example of what these automated tools are doing: my automated 1-3-9 strategy with only dofollow contextuals results in about 30% link loss/dead links when checked through Backlink Monitor. Bear in mind that I don't use any delays in GSA SER; it builds the next tier as soon as there are 100 live links. Using delays would definitely give better results, but I'm in a hurry.
But by running each tier separately, with a link check after each tier is built, the link loss/dead links are reduced to almost 0%, as they are removed before the next tier is built. A much more efficient approach to SEO. Of course there will be additional link loss over the following months, but as it's being managed by a separate tool, it's quite easy to monitor and to plan new campaigns that replace the lost links in the tiers. Replacing these lost links and re-powering the tiers with indexed links is how you recover or boost rankings; the ranking power comes from the indexed links in your tiers.
Authority Of Link Sources
This is another critical factor that will affect the results you see. The higher-authority link sources should be in your T1 and also your T2. If you are using just GSA SER for T1/T2/T3, then you are already limiting the results of the strategy. If your T1/T2 link sources are DA0-DA20, then your foundation doesn't have a lot of link juice to start with, and powering these up further with tiers isn't going to do much for your rankings, because there isn't much link juice in the tiered structure in the first place.
For long tails it works well, and site-wide campaigns building links to hundreds of URLs are another good strategy that will rank long tails quite easily onto page 1. Whilst I do run T1/T2/T3 links with GSA SER via my 1-3-9 strategy, that's not all I do, as it's not enough to rank on its own.
I'll outsource to other services, use my own PBN network, and use RankerX, which gives about 300 dofollow links from high-DA unique domains.
There is also manual link building on high-DA sites and even outreach link building. You need to think beyond just one set of link sources: the more high-DA unique domains you can place links on, the higher and quicker you will rank. Knowing this, why would you rely on just one piece of software? SEO is about being competitive, so be continuously on the lookout for new link sources.
Anchor Ratios
The ratios are tough to control. Although third-party tools like Ahrefs and SEO SpyGlass have anchor clouds that make it very easy to visualise your anchor ratios, their data combines indexed and non-indexed links. As links are deindexed or die, your anchor ratios, as Google sees them in its index, will be changing. An over-optimised anchor profile (too many keyword anchors) will send your rankings backwards.
Whilst exact match anchors in your links will boost specific keyword rankings, they need to be balanced with generics, branding, and other variations of related keyword anchors. To play it safe, your top anchors should be variations of branded anchors only, and your keyword anchors should be less than 1% of the profile.
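Checking the 1% rule yourself is straightforward once you have an export of your anchor texts. A minimal sketch (the function names, example profile, and the simple exact-string matching are my own assumptions; real anchor classification usually also groups partial-match and related-keyword variants):

```python
from collections import Counter

def anchor_ratios(anchors: list[str]) -> dict[str, float]:
    """Return each anchor text's share of the total profile, as a percentage."""
    counts = Counter(a.lower().strip() for a in anchors)
    total = sum(counts.values())
    return {a: 100 * c / total for a, c in counts.most_common()}

def over_optimised(anchors: list[str], keyword: str, limit: float = 1.0) -> bool:
    """Flag the profile if the exact-match keyword exceeds the limit (default 1%)."""
    return anchor_ratios(anchors).get(keyword.lower().strip(), 0.0) > limit

# 200 links: mostly brand variations and generics, 3 exact-match keyword anchors
profile = (["Acme Widgets"] * 120 + ["acmewidgets.com"] * 50
           + ["click here"] * 27 + ["buy blue widgets"] * 3)
print(over_optimised(profile, "buy blue widgets"))  # True: 1.5% exceeds the 1% limit
```

Recompute this against your live, indexed links only (not a third-party tool's cached export) and you'll be much closer to the ratios Google actually sees.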