Firstly, there are no limits when it comes to link building. Link velocity is not a Google ranking factor, meaning it makes no difference how quickly or slowly you build links. But obviously, the quicker you build the links, the quicker you will rank.
As for the 3 tier structure, the only limit is the memory of the software and the performance of your machine to handle large volumes of projects.
The simplest three-tier structure I use that still produces results is what I call a 1-3-9 pyramid.
T1 - 1 project, pointed at the money site/landing page(s)
T2 - 3 projects, pointed at the T1 project; results in 3 inbound links being built to each T1 link on average
T3 - 9 projects; each T2 project has 3 T3 projects pointed at it, resulting in each T2 link having on average 3 inbound links
If you can understand that structure, then you can start to build something bigger. The biggest pyramid I've run on one install is a 1-10-100, which results in 10 inbound links being pointed at each link in the tier below, so every T1/T2 link has 10 inbound links. Very powerful for rankings. To make more T2/T3 links, I just reset the T2/T3 projects once they've finished and run them again, pointing links at the same T1 projects. If you delete the target URL cache/history on all projects, it will make new links on all 3 tiers, meaning new T1 links on top of the ones already sitting in the T1 verified list.
If you do this 10 times, then the T1 links will end up with 100 inbound links, and each batch of T2 links will always have 10 inbound links.
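To make the arithmetic concrete, here's a minimal sketch of the link maths (assuming, purely for illustration, that every project verifies around 1,000 links per run; real numbers depend entirely on your site list):

```python
# Illustrative arithmetic only - not GSA SER code. Assumes every project
# verifies roughly the same number of links per run (a made-up figure).
LINKS_PER_PROJECT = 1000

def avg_inbound(projects_above: int, projects_this_tier: int) -> float:
    """Average inbound links each link in a tier gets from the tier above it."""
    links_above = projects_above * LINKS_PER_PROJECT
    links_in_tier = projects_this_tier * LINKS_PER_PROJECT
    return links_above / links_in_tier

# 1-10-100 pyramid: 1 T1 project, 10 T2 projects, 100 T3 projects
print(avg_inbound(10, 1))     # each T1 link: ~10 inbound links from T2
print(avg_inbound(100, 10))   # each T2 link: ~10 inbound links from T3

# Resetting T2/T3 and re-running them against the same T1 links
# stacks another ~10 inbound links onto each T1 link per run:
runs = 10
print(avg_inbound(10, 1) * runs)  # each T1 link: ~100 inbound links
```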
That's about as big as you can go, as that template uses 111 projects and you can only run one of these at a time per install. It depends on your site list as well: small site lists will use less memory, so you might be able to run 2 of these on one install. Best to test and see what the limits are.
The other way of doing it is to build each tier separately, but it's a real pain to do it this way and requires a lot of manual labour. The above strategies are fully automated.
Doing each tier separately allows you to build a much bigger pyramid, but it's a nightmare to manage. The more T2 links you have, the more T3 links you'll need.
If it's not too much to ask, can you tell me what types of links you create at each level?
For tier 1, only articles, right?
And what do you use for the rest of the levels?
For T1 I'm using do follow links only, preferably with keyword anchor text, although there are a few engines which only do URL anchors, which is fine if the keyword is in the URL of your landing page. Personally I ignore the name of the platform/engine and only look at whether the link is do follow with a keyword anchor; that's what's important on T1.
It looks like this:
Select the engines you see in the screenshot, then right click and choose "uncheck engines with no/do follow links". Then disable the option for "doing both" as shown below. This will deselect all the no follow engines.
I also deselect the fluxbb forum engine as for me they're all no follow. Sometimes I deselect the discuz forum engine if pointing links at my homepage, as they are all URL anchor links. So it depends on what you want the software to do. You need to improvise when needed.
The number of links made will depend on how good your site list is. This selection of engines is the rarest in the software but also the most valuable to use as T1 links. So get scraping lol
For T2/3 I'm using a different selection. Again it's all about do follow links:
Same as the above selection with 134 engines, but I've added URL shorteners, exploits and blog comments. You could add guestbooks and image comments too, which are also good link sources for tiers, if you can find the sites.
This is just one way of doing things. I'm testing right now with just 2 tiers, as 3 tiers is quite resource intensive. I'm still not sold on the 2 tier strategy, but I'm aware lots of users do just 2 tiers, and in some cases just 1 tier. But for the keywords I'm chasing, I don't think it's powerful enough to compete. I'm talking about keywords with 10k-100k searches/month. Big boy keywords! lol
2 tiers is a lot easier to manage, so it would be nice if I could make the 2 tier strategy work for me.
Learn how to scrape and build your own lists. Buying lists every time you need new link sources will get you nowhere. Lists are always oversold and get burnt quickly. Plus they are all poop anyway, with hardly any sites in them.
Most list sellers sell you an identified list (raw scraped, non-working sites) and pretend it's a verified list (100% working sites).
Then users come on the forum complaining about slow speeds and low VPM. This is what happens when you pay for poop.
Building your own lists is the way to go. Don't be one of those users who wastes money on lists and then wonders why they have slow speeds with the software.
It just wastes everyone's time and the list sellers make more money lol
What if you use this setup: T1 - 1 project; T2 - 1 project, pointed at T1.
You post, say, 1000 links to T1, then 3000 links to T2. Does this also result in each T1 link getting 3 T2 links on average?
Yes it does: 3000 T2 links spread across 1000 T1 links averages out to 3 inbound links per T1 link. But I wouldn't stop at just 1 T2 project. More T2 links will increase the page authority and move your rankings up further. If you add in T3 links, then your T2 links will increase their page authority and make the T1 links even more powerful.
Either way will still work. It's about doing enough in your tiers to make the T1 links powerful enough to make your keywords rank.
Sickseo, a new question, sorry to bother you again.
If I publish my links on sites that are not in the same language as my site, how much does it affect the result?
I ask because there are very few sites in Spanish where you can leave a link, and many in English.
Do you have experience with this? In other words, can it negatively affect my site? Or can I get good results all the same?
I don't see how it could have negative results. Google has said that it's fine to have links from different language sites. You can't control who links to you.
In terms of relevance, posting content with your link in your target language is as far as one can optimise things. Posting on different language sites is fine. If you saw the variety of countries my link sources come from: most are foreign sites, and the links still move my rankings up.
Hey sickseo. I've been reading all your posts on how you structure your campaigns. Just wondering your thoughts about a few of your choices for T2/T3. You added blog comments to the 2nd tier. Now I know you usually do a very large T3, so I'm wondering if it's the best use of resources to build T3 links to your blog comments, as the nature of blog comments is that there will be a lot of dilution of power since there could be many links on that page. So do you still include them in your T2, or is this more of a T3 thing?
And just wondering what value you find in exploits? I can't seem to find an answer about why they would be used and how effective they are at pushing power. Could you elaborate on this please?
And lastly, you have added URL shorteners to T2. Do you still do it this way? I was thinking they would be more effective at T3, as again I would think it a waste of resources to point mass T3 links at shorteners if they were on T2. But just wanted to get your thoughts on this.
Thanks again for all your shares. I'm taking serious notes.
Regarding blog comments: I set the OBL filter to 100 outbound links, so any high OBL pages are avoided. There are some ridiculously high DA sites amongst blog comments. If the OBL count is less than 100, then it makes them worthwhile, even good enough for T1.
Regarding exploits: I love these links lol. Do follow with keyword anchors, and I've seen the DA go as high as DA80 on some sites. Statistically, they are an excellent source of links. They are real sites, but the pages are created via the PHP log system. The links don't look very nice and it's unlikely you'll see them get into the serving index. I use them across all tiers. Use them at your own risk! I seem to be the only person making the most of them.
URL shorteners/redirects come in all sorts of varieties, including do follow, no follow, keyword in URL, static pages with your link, as well as no physical page. The way the software has been programmed for these platforms, you can literally do a raw scrape and find working links. The software even makes links from YouTube and Google properties, which are DA 100. So yes, I use them in T2/3, as it's a massive source of link juice. The last time I checked my site list I had maybe 40,000 unique domains, and I still have over 1700 unique Google domains. I've even used them as T1 to boost the referring IPs, which is another ranking factor. I have no idea why you would not want to be using these link sources.
I also set the tiered projects to only build links to those that are do follow. Blog comments and URL shorteners are huge platforms, mixed with a lot of no follow links or even temporary redirects. So although I use them in T2/3, the software will only build links to links that are do follow, whilst using both no follow and do follow link sources in the tiers.
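To illustrate what those two filters are actually checking, here's a rough sketch (this is not GSA SER's internal code, just the idea expressed with requests and BeautifulSoup; worth_keeping and MAX_OBL are made-up names):

```python
# Sketch of the two checks described above: the page must hold our link as
# dofollow, and its outbound-link (OBL) count must stay under the cap.
import requests
from bs4 import BeautifulSoup

MAX_OBL = 100  # the OBL cap mentioned above

def worth_keeping(page_url: str, my_domain: str) -> bool:
    html = requests.get(page_url, timeout=15).text
    anchors = BeautifulSoup(html, "html.parser").find_all("a", href=True)

    # OBL filter: lots of external links dilutes the juice each one passes
    external = [a for a in anchors
                if a["href"].startswith("http") and my_domain not in a["href"]]
    if len(external) >= MAX_OBL:
        return False

    # dofollow filter: keep the page only if our link carries no rel="nofollow"
    mine = [a for a in anchors if my_domain in a["href"]]
    return any("nofollow" not in (a.get("rel") or []) for a in mine)
```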
You can try to micromanage things and decide which links are good for Google and which links aren't. I suspect that as you exclude more and more links from your sources, you'll end up with very few sites to play with. Even pretty-looking contextual links with high DA, such as the sites in RankerX, don't get indexed. So why would using an exploit be any different? Each link is a signal. Some get indexed, some don't. But if it's crawled, then Google knows about it and it counts towards ranking calculations.
Personally, I have my own criteria for link sources. The main criterion is do follow; the second is keyword anchor text. That's it. lol Keyword in URL helps a lot, but even a do follow link with a URL anchor helps. Even a page with a citation and no clickable link helps. These are all signals that Google sees.
My approach is the more links the better. If it's do follow, I'll use it as a link source.
Thanks so much for all your detailed responses. Absolutely brilliant.
Just to summarize what you have said in this awesome thread:
Requirements For All Tiers:
Do follow is a must
Contextual keyword anchor text (if possible)
T1
Engines:
Article
Forum
Microblog
Social Bookmarking
Social Network
Web 2.0
Wiki
T2/T3
Engines:
Article
Forum
Microblog
Social Bookmark
Social Network
Web 2.0
Wiki
Additional engines Sickseo adds to T2/T3 that were not part of T1:
Blog Comment (filter: outbound links less than 100)
Exploit
URL Shortener (sometimes Sickseo uses these as T1 links also)
Additional engines to consider adding to T2/T3 (optional):
Guestbooks
Image Comments
Just a few questions to clear some things up for me:
1) Would you ever use Exploits on T1 or is this strictly for T2 and T3?
2) So I'm assuming you verify the links that are built on T1 and T2. But do you verify links at T3, or do you just blast away without spending extra resources verifying at this level?
3) When adding the tier, whether T2 or T3, I'm sure you select "Do follow only", but do you also select "Use anchor text from the verified url"? And do you apply any other settings to the tier filter when the box pops up?
Do you build links to blog comments, exploits and URL shorteners? Or do you just stick to the main engines from T1?
Any other settings do you recommend for the “tier filter” box?
T1 - 1 project, pointed at the money site/landing page(s)
T2 - 3 projects, pointed at the T1 project; results in 3 inbound links being built to each T1 link on average
T3 - 9 projects; each T2 project has 3 T3 projects pointed at it, resulting in each T2 link having on average 3 inbound links
First of all, thanks for sharing all of this knowledge!
I don't understand, though, the concept of 3 projects for each tier. What is the difference between these three projects? It's the same platforms. Would it not be the same to run a single tier project until it has created 3x the links of the upper project?
The only reason there are 3 projects is that more projects means more links. 1 project will only make so many links, so having 3 of them means 3 times more links in that tier. I've got templates that use 10 projects in each tier instead of 1, so that's 10 times more links in each tier versus using 1 project in the T2/T3.
My 2 tier 1-100 template uses 100 projects in the T2, so that's 100 times more links versus using 1 project in T2. This means the referring domains for the T1 links will be significantly higher.
1) Right now I've got all servers running exploits on T1/T2/T3. I've been using exploits on T1 for well over 1 year now lol. They are do follow links with keyword anchors, and some have decent DA too, as the sites are real sites. This last month I stopped using exploits on T1 to see if results were better, just using 134 engines and the link wheel option on T1.
Personally, I see better ranking results when using exploits on T1, although I should do an isolated test with just exploits on T1 and no other engines to see what they're really capable of. But use them at your own risk. This link source is very black hat. But then any artificial link building is considered black hat lol
2) T1 links will have the re-verify option enabled. T2/3 links get deleted once finished, so there's no need for the re-verify option there.
3) I only enable the do follow option. It's up to you if you want to use some of the other filter options. I used to, but I don't anymore. I'm just interested in every do follow link getting inbound links, regardless of anchor text or engine.
Blog comments, exploits and URL shorteners do get links built to them if they're do follow. I don't filter based on engine. Every do follow link from every tier gets links built to it.
Remember that this is just one strategy, but do follow links across tiers are what matters. I've got 2 tier campaigns and 3 tier campaigns running right now. Some are set up with 137 engines on all 3 tiers, which are all do follow link sources including exploits, excluding blog comments and URL shorteners.
Others are set up with 137 engines on T1 and 248 engines on T2/3, which adds the extra blog comments and URL shorteners. I've even got single tier blasts running 248 engines pointing at Money Robot and RankerX links. There are literally endless ways of using the software to build links.
Got it, thanks.
How are you setting up these sub-projects, manually? It appears I can only create one tier project in GSA using "Modify Project > Duplicate > Add a tier project". If manually, do you copy/paste the T1 verified links into the T2 URLs?
The duplicate > add tier option is for adding extra tiers. For duplicating projects in the same tier, use modify project > duplicate just data/options. With just these 2 options you can build a tiered link structure of any size.
For my case, the problem with tiers 1/2/3 is the 3GB RAM limitation of GSA SER. I'm running 4 T1 projects and can only make 1 T2 active (I used "per URL" for T2). I also used the scheduler, but it's not working as I expected. I hope there will be a 64-bit version, but I guess that's impossible, so maybe the solution is more licenses and more VPSs, which isn't feasible at the moment with the project budget.
That doesn't sound right. I can have about 100-150 projects active at the same time on the same install.
Normally memory usage is fine until it gets to about 4 million URLs. But this does depend on the type of URLs you use.
@sickseo, may I know what quality of T1 content you are using to point backlinks at your money sites? High quality manually spun content, or the typical tools that rewrite content automatically for uniqueness?
I'm using Article Forge and WordAi on all tiers. Very good quality content that's keyword optimised. For titles I'm using the built-in title spinner in GSA, so the URLs will contain the keywords I'm ranking for.
I've used Kontent Machine in the past, and to be fair, I was still seeing results with poop scraped/spun content. I only invested in the AI tools in December, 5 months ago.
Have you noticed any difference enabling the link wheel option?
What have you found to be a good reverification setting? Do you ever pay attention to what gets deleted within a given timeframe and adjust?
Are you still sending T2 and T3 links to an indexer? Why delete T2 or even T3 links?
@sickseo I'm only doing 20 VPM (building to most engines except documents and RSS) on 500 threads with good private proxies and high end servers. Was wondering if you can share with us how you achieve such high VPM (apart from scraping your own lists).
I am looking into scraping my own lists now though.
I think this thread could be a sticky post as a best practice guide.
For T1, do you use all the links that you scrape, or only links with higher Domain Authority (DA), like DA 15+?
Thank you
Here's the thing about DA. It's a Moz metric, not a Google metric, and incredibly unreliable. There are plenty of 3rd party data services out there, like Ahrefs, Moz, Majestic, SEO SpyGlass. They all have their own version of DA, and the figure is never the same for a domain when you check them. It wouldn't be, either, as none of them has a database the size of Google's, only a fraction of it. All of these data sources are incredibly unreliable.
I pay no attention to DA. It's useless as a metric. You'd think links from high DA sites are what it's all about, when in actual fact the number of external links the high DA site carries matters just as much. The higher the number of external links, the less link juice gets passed through each one, meaning you won't get as much of a ranking boost as you'd expect.
A low DA site with fewer external links can actually pass more link juice and have a greater impact on rankings than a higher DA site with a greater number of external links.
I also manipulate the DA of every single site in my site list by building links to the homepage automatically within my campaigns: 10% of all links are automatically sent to the homepage across all tiers. So even newly scraped sites at DA0 will automatically have their homepage powered up with links. They don't stay at DA0 for very long lol. The knock-on effect is that the pages holding my links also increase in PA as the DA increases.
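As a sketch of that homepage trick (in GSA SER this is a project setting; the code below just illustrates the weighted pick, and pick_target is a made-up name):

```python
# Roughly the "10% of links go to the homepage" idea from the post above:
# a weighted choice between the site's homepage and the deep page.
import random
from urllib.parse import urlsplit

def pick_target(deep_url: str, homepage_share: float = 0.10) -> str:
    """Send ~10% of links to the homepage, the rest to the deep page."""
    parts = urlsplit(deep_url)
    homepage = f"{parts.scheme}://{parts.netloc}/"
    return homepage if random.random() < homepage_share else deep_url
```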
sickseo, I know the post is nearly a year old now, but I'm hoping you can clear one thing up for me, as I can't quite wrap my head around it even after re-reading your explanation a few times.
I'm still unclear how adding additional separate projects creates more links than a single project can, if we can set a single project to create however many links we want per day/total.
Is it a resource spread thing, or what am I missing here?
Thanks
Each time 1 project runs, it will create "X" number of links. If you run 2 projects simultaneously, it will create double the number of links in the same time frame. If you run 10 projects simultaneously, then you're making 10 times the number of links in the same time frame.
By pointing multiple projects at a single project, you're significantly increasing the number of inbound links a lower project will get within the same time frame versus just using one project for each tier. Nothing wrong with using just one project in each tier, but it will take longer to reach the same link numbers.
The goal is to create as many inbound links for lower tiers as quickly as possible. Having extra projects in each tier facilitates that.
Let's say one of your projects only makes 1000 links each time it runs. In the most basic 3 tier example, with 1 project in each tier, the 1000 T2 links are spread across the 1000 T1 links, so each T1 link gets roughly 1 inbound link. Put 10 projects in each lower tier instead and you've got 10,000 T2 links spread across the same 1000 T1 links.
Now we're making 10 inbound links to each link in the tier below, which is clearly way more powerful than the basic example with only 1 project in each tier. Boosting DA/PA is all about links from unique domains. 1 link between 2 sites creates link juice. More inbound links = more link juice.
Also, it doesn't have to be just a 3 tier structure; you can build something bigger with 5 or even 7 tiers. For more competitive keywords, you will need something bigger.
The website is a work in progress, but I'm glad you like it. It took a while to get the design to something I really liked. It's a bit thin on the content side, but I've got lots of SEO content planned for when I have the time.
My site list stats at the moment are just over 50k unique domains across all engines. Most of these are redirect and indexer links though. Probably about half of those are do follow.
The forum, article and social network sites are the really important ones for T1, and I've got just under 1000 domains that are do follow, most of them GnuBoard. Combined with my other tools, RankerX (500 DF) and Money Robot (750 DF), there are just over 2000 do follow domains I can use as T1 that support keyword anchor texts. Plus I've started to grow my own PBN network, which has 100 sites now, all DA20+. These were brand new domains a few months ago at DA0.
For indexing, I abandoned indexing/crawling services as they can't cope with the volume of links I send them lol. Pricing is extortionate for the volume of links I build. One service even banned me lol
I've built my own in-house system instead using ZennoPoster. It gets the links crawled, which is all I'm interested in, and it handles an unlimited number of backlinks, which is what I needed. It certainly isn't a miracle indexing service, but crawled links still count towards ranking calculations, and the system gets 100% of links crawled by Google.
1. What are you using Money Robot for exactly? From my experience, Money Robot builds toxic backlinks from its own web farm, and every Money Robot customer has been building backlinks on the same properties for the past 10 or more years.
2. For your ZennoPoster system, is this method something that @sven can incorporate into GSA SEO Indexer?
1. I'm running Money Robot links in the same way as any other tool: direct to the money site, as well as to power up other links. You're right about the site list being a private link farm, all owned by the software developer.
I run it with 90% generics though, as the templates are pretty big and it can make thousands of links pretty quickly; it's easy to over-optimise. It's no different to what RankerX is doing: thousands of users hammering the same group of sites. GSA SER users are doing the same thing with the built-in CMS engines; right now we're all hammering the GnuBoard CMS lol. There was FCS Networker, which also did the same thing, thousands of users hammering the same group of sites. Now we have SERLib allowing users to do the same thing. It's really not an issue at all. The only real question is: does Google index these links?
I know they show as toxic links if you use SEMrush. I also use SEO SpyGlass to monitor links, but they don't show up as toxic there; the links show zero risk of penalty. Here's the thing: neither of these 3rd party services represents what Google sees and thinks. Personally I pay zero attention to any of these 3rd party tools.
If you use the "site" command on the web 2.0 blogs, bookmarks and wiki sites, domains are all indexed in google with thousands of pages showing as indexed from each property. They are permanent links, 100% do follow and also indexable by google. So aside from what a 3rd party data service is saying about links being toxic, they seem to be the perfect link source.
If the domains were deindexed from Google, that's when I'd believe Google has an issue with them. But whilst Google continues to index links from these sites, I'll continue to use the software for backlinks lol
That's a weekly rank checking report for one of my clients' websites. It's been hammered with all 3 tools, GSA, RankerX and Money Robot, for nearly 3 years. Whilst SEMrush shows many of the backlink sources as toxic, my rank checking report shows it doesn't matter lol. Between the 2 data sources, I'm inclined to ignore SEMrush and rely on my own data from the rank checking report. Upward movement in rankings is the only data I pay any attention to. The site is DA 50 now and growing.
Also, if these types of links really did any harm to rankings, then I'd be using them to run my negative SEO service for clients instead lol
2. All I'm doing is making sitemaps of backlinks and pinging them to Google. So yes, it could quite easily be incorporated into GSA SER or GSA SEO Indexer. You would need somewhere to host the sitemap files though.
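For what it's worth, the sitemap-and-ping idea can be sketched in a few lines (my own illustration, not sickseo's actual ZennoPoster template; note that Google retired the sitemap ping endpoint in 2023, so the ping step is historical):

```python
# Write backlink URLs into a standard XML sitemap, host it, then ping Google.
import requests
from xml.sax.saxutils import escape

def build_sitemap(backlink_urls, out_path="backlinks-sitemap.xml"):
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>"
                        for u in backlink_urls)
    xml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
           '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
           f"{entries}\n</urlset>")
    with open(out_path, "w", encoding="utf-8") as f:
        f.write(xml)

def ping_google(sitemap_url: str):
    # Historical endpoint - Google deprecated sitemap pings in mid-2023.
    requests.get("https://www.google.com/ping",
                 params={"sitemap": sitemap_url}, timeout=15)
```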
Hey @rastarr, how did you decide MR backlinks are toxic?
Firstly, there are no limits when it comes to link building. Link velocity is not a google ranking factor, meaning it makes no difference how quickly or slowly you buld links. But obvioulsy, the quicker you build the links, the quicker you will rank.
As for the 3 tier structure, the only limit is the memory of the software and the performance of your machine to handle large volumes of projects.
The most simple 3 tier structure I use that still produces results is what I call a 1-3-9 - 3 tier pyramid.
T1 - 1 project - pointed at money site/landing page(s) T2 - 3 projects - pointed at the T1 project - results in 3 inbound links being built to each T1 link on average T3 - 9 projects - Each T2 project has 3 T3 projects pointed at it - results in each T2 having on average 3 inbound links pointed at it
If you can understand that structure, then you can start to build something bigger. The biggest pyramid that I've run on one install is a 1-10-100 which results in 10 inbound links being pointed at each link in the tier below. So every T1/T2 link has 10 inbound links each. Very powerful for rankings. To make more T2/3 links, I just reset the T2/3 projects once it's finished and run them again pointing links at the same T1 projects. If you delete target url cache/history on all projects, it will make new links on all 3 tiers, meaning new T1 links on top of the ones already sitting in the T1 verified.
If you do this 10 times, then the T1 links will end up with 100 inbound links and each of the T2 will always have 10 inbound links.
That's about as big as you can go as that template uses 111 projects and can only run one of these at a time per install. Depends on your site list as well. Small site lists will use less memory, so you might be able to run 2 of these on one install. Best to test and see what the limits are.
The other way of doing it is to build each tier separately, but it's a real pain to do it this way and requires a lot of manual labour. The above strategies are fully automated.
Doing each tier separately allows you to build a much bigger pyramid, but it's a nightmare to manage. The more T2 links you have will require even more T3 links.
@sickseo, I know the post is nearly a year old now, but I'm hoping you can clear one thing up for me, as I can't quite wrap my head around it even after re-reading your explanation a few times.
I'm still unclear how adding additional separate projects creates more links than a single project can, if we can set a single project to create however many links we want per day/total.
Is it a resource spread thing, or what am I missing here?
Thanks
Each time 1 project runs, it will create "x" amount of links. If you run 2 projects simultaneously, it will create double the amount of links in the same time frame. If you run 10 projects simultaneously, then you're now making 10 times the number of links in the same time frame.
By pointing multiple projects at a single project, you're significantly increasing the number of inbound links a lower project will get within the same time frame versus just using one project for each tier. Nothing wrong with using just one project in each tier, but it will take longer to reach the same link numbers.
The goal is to create as many inbound links for lower tiers as quickly as possible. Having extra projects in each tier facilitates that.
Let's say one of your projects only makes 1000 links each time it runs. This is the most basic 3 tier example:
T1 - 1 project - 1000 links
T2 - 1 project - 1000 links
T3 - 1 project - 1000 links
That would create 1 inbound link to each link in the tier below. Let's look at something bigger:
T1 - 1 project - 1000 links
T2 - 3 projects - 3000 links
T3 - 9 projects - 9000 links
Now we're making 3 inbound links to each link in the tier below. Now we go even bigger:
T1 - 1 project - 1000 links
T2 - 10 projects - 10,000 links
T3 - 100 projects - 100,000 links
Now we're making 10 inbound links to each link in the tier below, which is clearly way more powerful than the first example using the most basic 3 tiers with only 1 project in each tier. Boosting DA/PA is all about links from unique domains. 1 link between 2 sites creates link juice. More inbound links = more link juice.
Also, it doesn't have to be just a 3 tier structure - you can build something bigger with 5 or even 7 tiers. For more competitive keywords, you will need to use something bigger.
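To make the arithmetic concrete, here's a tiny Python sketch - an illustration of the maths only, not GSA SER code. It assumes every project verifies roughly the same number of links and that inbound links spread evenly over the tier below, as described above.

def pyramid(projects_per_tier, links_per_project=1000):
    """Print links built per tier and average inbound links per link below."""
    prev_links = None
    for tier, projects in enumerate(projects_per_tier, start=1):
        links = projects * links_per_project
        if prev_links:
            print(f"T{tier}: {projects} projects, {links:,} links "
                  f"-> ~{links // prev_links} inbound per T{tier - 1} link")
        else:
            print(f"T{tier}: {projects} projects, {links:,} links")
        prev_links = links

pyramid([1, 3, 9])     # the 1-3-9 pyramid: ~3 inbound links per link below
pyramid([1, 10, 100])  # the 1-10-100 pyramid: ~10 inbound links per link below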
1. I'm running money robot links in the same way as any other tool. Direct to money site, as well as to power up other links. You're right about the site list being a private link farm. All owned by the software developer.
I run it with 90% generics though, as the templates are pretty big and it can make thousands of links pretty quickly. Easy to over-optimise. It's no different to what rankerx is doing: thousands of users hammering the same group of sites. GSA SER users are doing the same thing with the built-in cms - right now we're all hammering the gnuboard cms lol There was FCS networker which also did the same thing, thousands of users hammering the same group of sites. Now we have serlib which is allowing users to do the same thing. It's really not an issue at all. The only real question is "do google index these links?"
I know they show as toxic links if you use sem rush. I also use seo spyglass to monitor links, but they don't show up as toxic links in there - links have zero risk of penalty. Here's the thing: neither of these 3rd party services represents what google sees and thinks. Personally I pay zero attention to any of these 3rd party tools.
If you use the "site" command on the web 2.0 blogs, bookmarks and wiki sites, the domains are all indexed in google with thousands of pages showing as indexed from each property. They are permanent links, 100% do follow and also indexable by google. So aside from what a 3rd party data service is saying about links being toxic, they seem to be the perfect link source.
If the domains were deindexed from google, that's when I'd believe that google has an issue with them. But whilst google continues to index links from these sites, I'll continue to use the software for backlinks lol
That's a weekly rank checking report for one of my clients' websites. It's been hammered with all 3 tools - gsa, rankerx and money robot - for nearly 3 years. Whilst sem rush shows many of the backlink sources as toxic, my rank checking report shows it doesn't matter lol Between the 2 data sources, I'm inclined to ignore sem rush and rely on my own data from the rank checking report. Upward movement in rankings is the only data I pay any attention to. The site is DA 50 now and growing.
Also, if these types of links really did any harm to rankings then I'd be using them to run my negative seo service for clients instead lol
2. All I'm doing is making sitemaps of backlinks and pinging them to google. So yes, it could quite easily be incorporated into gsa ser or gsa seo indexer. You would need somewhere to host the sitemap files though.
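For anyone wondering what "making sitemaps of backlinks and pinging them to google" might look like, here's a minimal sketch of the idea. To be clear, this is not the author's actual ZennoPoster system: the hosting URL is hypothetical, you'd need to upload the file somewhere Google can fetch it, and Google deprecated the sitemap ping endpoint in 2023, so the ping half is historical.

import urllib.parse
import urllib.request
from xml.sax.saxutils import escape

def write_backlink_sitemap(backlinks, path="backlinks-sitemap.xml"):
    # Standard sitemap format; the protocol caps one file at 50,000 URLs.
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in backlinks)
    xml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
           '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
           f"{entries}\n</urlset>\n")
    with open(path, "w", encoding="utf-8") as f:
        f.write(xml)

def ping_google(sitemap_url):
    # Historical endpoint (retired by Google in 2023); returned HTTP 200
    # when the sitemap URL was accepted for crawling.
    ping = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")
    with urllib.request.urlopen(ping) as resp:
        return resp.status

write_backlink_sitemap(["https://example-forum.com/profile/123",
                        "https://example-wiki.com/page/my-link"])
# ping_google("https://your-host.example/backlinks-sitemap.xml")  # hypothetical hosting URL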
Thanks again for the detailed replies, mate. They are much appreciated. [1] As a newbie who started with Money Robot, SEM Rush scared the crap out of me with their toxic link report. I've dusted off my lifetime license and will plug it back into powering up other links. Great to know your real-life experience.
What tool produced that weekly rank-checking report, if I may ask?
[2] It would be great if @sven could/would add this sitemap ability to gsa seo indexer and make the program much more useful. Obviously some changes need to be made but it would make it an even better indexing tool. Fingers crossed
I use it for keyword research as well as rank tracking. You can have it track thousands of keywords. You'll need private proxies to run the checks, but you also get a quota each month where they use their proxies to run checks for you.
This is just one way of doing things. I'm testing right now with just 2 tiers, as 3 tiers is quite resource intensive. I'm still not sold on the 2 tier strategy, but I'm aware lots of users do just 2 tiers - in some cases just 1 tier. But for the keywords I'm chasing, I don't think it's powerful enough to compete. I'm talking about keywords with 10k-100k searches/month. Big boy keywords! lol
2 tiers is a lot easier to manage, so would be nice if I can make the 2 tier strategy work for me.
Most list sellers sell you an identified list (raw scraped, non-working sites) and pretend it's a verified list (100% working sites).
Then users come on the forum complaining about slow speeds and low vpm. This is what happens when you pay for poop.
Building your own lists is the way to go. Don't be one of these users that wastes money on lists and then wonders why they have slow speeds with the software.
It just wastes everyone's time and the list sellers make more money lol
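To illustrate the identified-vs-verified gap: real verification means the software successfully posts a link and then finds it live, which only the tool itself can do. But even the cheapest first pass - dropping domains that no longer respond at all - shows why a raw scraped list shrinks so much. A sketch under that assumption, with made-up example URLs:

import concurrent.futures
import urllib.request

def responds(url, timeout=10):
    # Cheapest possible check: does the site answer at all?
    try:
        req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        return False

def first_pass_filter(identified_urls, workers=20):
    # An "identified" list is just raw scraped targets; this keeps only the
    # ones that are still alive. Nothing here posts or verifies any links.
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        alive = list(pool.map(responds, identified_urls))
    return [u for u, ok in zip(identified_urls, alive) if ok]

raw_list = ["https://example-board.com/forum", "https://dead-site.example/"]
print(first_pass_filter(raw_list))  # typically only a fraction survives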
T1 - 1 project
T2 - 1 project, pointed at T1
You post, say, 1000 links to T1, then 3000 links to T2.
Does this also result in each T1 link getting 3 T2 links on average?
In other words, can it negatively affect my site? Or will I get the same good results this way?
Either way will still work. It's about doing enough in your tiers to make the T1 powerful enough to make your keywords rank.
In terms of relevance, posting content with your link in your target language is about as far as you can optimise things. Posting on different language sites is fine. You should see the variation of countries my link sources are from - most are foreign sites and the links still move my rankings up.
And just wondering, what value do you find with Exploits? I can't seem to find an answer about why they would be used and how effective they are at pushing power. Could you elaborate on this please?
And lastly, you have added URL shorteners to T2. Do you still do it this way? I was thinking they would be more effective at T3, as personally I'd think it a waste of resources to point mass T3 links at shorteners sitting on T2. But just wanted to get your thoughts on this.
Thanks again for all your shares. I'm taking serious notes.
Regarding exploits: I love these links lol Do follow with keyword anchors, and I've seen the DA go as high as DA80 on some sites. Statistically, they are an excellent source of links. They are real sites, but the pages are created using the php log system. The links don't look very nice and it's unlikely you'll see them get into the serving index. I use them across all tiers. Use them at your own risk! I seem to be the only person that's making the most of them.
Url shorteners/redirects come in all sorts of varieties, including do follow, no follow, keyword in url, static pages with your link, as well as no physical page at all. The way the software has been programmed for these platforms, you can literally do a raw scrape and find working links. The software even makes links from YouTube and google properties, which are DA 100. So yes, I use them in T2/3 as they're a massive source of link juice. The last time I checked my site list I had maybe 40,000 unique domains, and I still have over 1700 unique google domains. I've even used them as T1 to boost the referring IPs, which is another ranking factor. I have no idea why you would not want to be using these link sources.
I also set the tiered projects to only build links to those that are do follow. Blog comments and url shorteners are huge platforms, mixed with a lot of no follow links or even temporary redirects. So although I use them in T2/3, the software will only build links to links that are do follow, whilst using both no follow and do follow link sources within the tiers.
You can try to micromanage things and decide which links are good for google and which links aren't. I suspect that as you exclude more and more links from your sources, you'll end up with very few sites to play with. Even pretty-looking contextual links with high DA, such as the sites in rankerx, don't always get indexed. So why would using an exploit be any different? Each link is a signal. Some get indexed, some don't. But if it's crawled, then google knows about it and it counts towards ranking calculations.
Personally, I have my own criteria for link sources. The main criterion is do follow; the second is keyword anchor text. That's it lol Keyword in url helps a lot, but even a do follow link with a url anchor helps. Even a page with a citation and no clickable link helps. These are all signals that google sees.
My approach is the more links the better. If it's do follow i'll use it as a link source.
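Since do follow is the one hard criterion here, this is a rough way to spot-check it on a page holding your link. A simplified sketch: it only inspects the anchor's rel attribute (treating nofollow/sponsored/ugc as not do follow), ignores page-level meta robots tags and X-Robots-Tag headers, and the URLs in the example are placeholders.

from html.parser import HTMLParser
import urllib.request

class LinkScanner(HTMLParser):
    """Find the anchor pointing at a target URL and record its rel attribute."""
    def __init__(self, target):
        super().__init__()
        self.target = target
        self.dofollow = None  # None = link to target not found on the page

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        if self.target in (a.get("href") or ""):
            rel = (a.get("rel") or "").lower()
            self.dofollow = not any(t in rel for t in ("nofollow", "sponsored", "ugc"))

def is_dofollow(page_url, target_url):
    req = urllib.request.Request(page_url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req, timeout=15).read().decode("utf-8", "replace")
    scanner = LinkScanner(target_url)
    scanner.feed(html)
    return scanner.dofollow

# print(is_dofollow("https://example-forum.com/thread/1", "https://yoursite.example"))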
First of all, thanks for sharing all of this knowledge!
I don't understand the concept of 3 projects for each tier though. What is different amongst these three projects? They use the same platforms. Would it not be the same to run a single tier project until it has created 3x the links of the project above it?
Just to summarize what you have said in this awesome thread:
Requirements For All Tiers:
Do follow is a must
Contextual
Keyword anchor text (If possible)
T1
Engines:
Article
Forum
Microblog
Social Bookmarking
Social Network
Web 2.0
Wiki
T2/T3
Engines:
Article
Forum
Microblog
Social Bookmark
Social Network
Web 2.0
Wiki
Additional engines Sickseo adds To T2/T3 that were not part of T1:
Blog Comment (Filter: Outbound Links: less than 100)
Exploit
URL Shortener (Sometimes Sickseo uses these as T1 links also).
Additional engines to consider adding to T2/T3 (optional):
Guestbooks
Image Comments
Just a few questions to clear some things up for me:
1) Would you ever use Exploits on T1 or is this strictly for T2 and T3?
2) So I'm assuming you verify the links that are built on T1 and T2. But do you verify links at T3, or do you just blast away without spending extra resources verifying at this level?
3) When adding the tier, whether T2 or T3, I'm sure you select "Do follow only", but do you also select "Use anchor text from the verified url"? And do you have any other settings you apply to the tier filter when the box pops up?
Do you build links to blog comments, exploits and URL shorteners? Or do you just stick to the main engines from T1?
Any other settings do you recommend for the “tier filter” box?
My 2 tier template 1-100 uses 100 projects in the T2, so that's 100 times more links versus using 1 project in T2. This means that the referring domains for the T1 links will be significantly higher.
1) Personally, I see better ranking results when using exploits on T1, although I should do an isolated test with just exploits on T1 and no other engines to see what they're really capable of. But use at your own risk - this link source is very blackhat. Then again, any artificial link building is considered black hat lol
2) T1 links will have the re-verify option enabled. T2/3 links get deleted once finished so no need to have re-verify option here.
3) I only enable the do follow option. Up to you if you want to use some of the other filter options. I used to, but don't anymore. I'm just interested in every do follow link getting inbound links, regardless of anchor text or engine.
Blog comments, exploits, url shorteners do get links built to them if they're do follow. I don't filter based on engine. Every do follow link from every tier gets links built to them.
Remember that this is just one strategy - what matters is do follow links across all tiers. I've got 2 tier campaigns and 3 tier campaigns running right now. Some are set up with 137 engines on all 3 tiers, which are all do follow link sources including exploits, but excluding blog comments and url shorteners.
Others are set up with 137 engines on T1 and 248 engines on T2/3 which has the extra blog comments and url shorteners. I've even got single tier blasts running 248 engines pointing at money robot and rankerx links. There is literally an endless way of using the software to build links.
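For reference, here are the engine groupings discussed in this thread expressed as plain data. This is a hypothetical planning structure - the category names come from the summary above and the do follow filter is the author's one rule - not GSA SER's actual configuration format.

# Hypothetical campaign-planning data, not GSA SER's config format.
BASE_ENGINES = ["Article", "Forum", "Microblog", "Social Bookmark",
                "Social Network", "Web 2.0", "Wiki"]
EXTRAS = ["Exploit", "Blog Comment", "URL Shortener"]  # added on lower tiers

TIER_ENGINES = {
    "T1": BASE_ENGINES,            # author also reports good T1 results adding "Exploit"
    "T2": BASE_ENGINES + EXTRAS,
    "T3": BASE_ENGINES + EXTRAS,
}
TIER_FILTER = {"dofollow_only": True}  # the one filter applied when tiering

for tier, engines in TIER_ENGINES.items():
    print(tier, "->", ", ".join(engines))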
How are you setting up these sub-projects - manually? It appears I can create only one tier project in GSA using "Modify Project - Duplicate - Add a tier project". If manually, do you copy/paste the T1 verified links into the T2 URLs?
Normally memory usage is fine until it gets to about 4 million urls. But this does depend on the type of urls you use.
@sickseo may I ask what quality of T1 content you are using to point backlinks at your money sites? High quality manually spun content, or the typical tools that rewrite content automatically for the sake of uniqueness?
I've used kontent machine in the past, and to be fair, I was still seeing results with poop scraped/spun content. I only invested in the AI tools in December, 5 months ago.
Have you noticed any difference enabling the linkwheel option?
Are you still sending T2 and T3 links to an indexer? And why delete T2 or even T3 links?
I am looking into scraping my own lists now though.
I think this thread could be a sticky post as a Best Practice Guide.
In T1, do you use all links that you scrape, or do you only use links with higher Domain Authority (DA) like DA 15+?
Thank you
I pay no attention to DA. It's useless as a metric. You'd think links from high DA sites are what it's all about, when in actual fact it's also about the number of external links that the high DA site has. The higher the number of external links, the less link juice gets passed through each link, meaning you won't get as much of a ranking boost as you'd expect.
A low DA site with less external links can actually pass more link juice and have a greater impact on rankings than a higher DA site with a greater number of external links.
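That's essentially the classic PageRank split: the value a page passes per link falls with its number of outbound links. A toy comparison under that assumption - "authority" here is an abstract stand-in, not DA itself:

def juice_per_link(authority, external_links, damping=0.85):
    # PageRank-style: a page's passable value is divided among its links.
    return damping * authority / external_links

high_da_busy = juice_per_link(authority=80, external_links=5000)
low_da_quiet = juice_per_link(authority=10, external_links=20)
print(f"high-DA page, 5000 links out: {high_da_busy:.4f} per link")  # 0.0136
print(f"low-DA page, 20 links out:    {low_da_quiet:.4f} per link")  # 0.4250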
I also manipulate the DA of every single site in my site list by building links to the homepage automatically within my campaigns. 10% of all links are automatically sent to the homepage across all tiers. So even newly scraped sites at DA 0 will automatically have their homepage powered up with links. They don't stay at DA 0 for very long lol The knock-on effect is that the pages holding my links also increase in PA as the DA increases.
So yes, I use all links that I scrape.
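That "10% of all links go to the homepage" behaviour is easy to picture as a weighted target choice. A sketch of the selection logic only - in GSA SER this is a project setting, and the example URL is made up:

import random
from urllib.parse import urlsplit

def pick_target(verified_url, homepage_share=0.10):
    # 10% of the time, aim the new link at the host's homepage instead of
    # the page that actually holds the existing link.
    if random.random() < homepage_share:
        parts = urlsplit(verified_url)
        return f"{parts.scheme}://{parts.netloc}/"
    return verified_url

targets = [pick_target("https://example-wiki.com/page/my-link") for _ in range(10000)]
share = sum(t == "https://example-wiki.com/" for t in targets) / len(targets)
print(f"{share:.1%} of links aimed at the homepage")  # ~10.0%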
I saw your new website and it looks great! Good luck with it!
can you share how many unique dofollow domains (target urls) you have?
I think the main "struggle" is to have more of these target urls...
Also, if you can share, what indexer service are you using these days?
Thanks!
My site list stats at the moment are just over 50k unique domains across all engines. Most of these are redirect and indexer links though. Probably about half of those are do follow.
The forum, article and social network sites are the really important ones for T1, and I've got just under 1000 domains that are do follow - most are gnuboard. Combined with my other tools, rankerx (500 DF) and money robot (750 DF), there are just over 2000 do follow domains I can use as T1 that support keyword anchor texts. Plus I've started to grow my own PBN network, which has 100 sites now - all DA20+. These were brand new domains a few months ago at DA 0.
For indexing, I abandoned indexing/crawling services as they can't cope with the volume of links I send them lol Pricing is extortionate for the volume of links I build. One service even banned me lol
I've built my own in-house system instead using zennoposter. It gets the links crawled, which is all I'm interested in. Plus it crawls an unlimited number of backlinks, which is what I needed. It certainly isn't a miracle indexing service, but crawled links still count towards ranking calculations, and the system does get 100% of links crawled by google.
As always, thanks for sharing!!!
Hi @sickseo, welcome back! Glad to have you back.
I think @googlealchemist means:
Instead of:
T1 - 1 project - 1000 links / day
T2 - 3 projects - 3000 links / day
T3 - 9 projects - 9000 links / day
Maintain 13 projects.
We can:
T1 - 1 project - 1000 links / day
T2 - 1 project - 3000 links / day
T3 - 1 project - 9000 links / day
Maintain only 3 projects.
Assuming factors such as VPS speed, memory, number of proxies etc. mean we can only create a maximum of 13,000 links per day.
However, I notice that GSA running 10 projects simultaneously can create a lot more links per day compared to running only 1 project.
It's not possible to increase the number of links that 1 project makes, as it depends on your site list and the number of sites in there. Even increasing the number of accounts/posts per site doesn't always work as planned, as not every platform supports this. You have to reset the project and run it again for it to make more links.
But if you know how to make a single project make more links, I'd love to hear how you do that.
Thanks again for your insights.