@sickseo I'm only doing 20 VPM (building to most engines except for documents, RSS) on 500 threads with good private proxies and high-end servers. Was wondering if you can share with us how you achieve such high VPM (apart from scraping your own lists).
I am looking into scraping my own lists now though.
Firstly, there are no limits when it comes to link building. Link velocity is not a Google ranking factor, meaning it makes no difference how quickly or slowly you build links. But obviously, the quicker you build the links, the quicker you will rank.
As for the 3-tier structure, the only limit is the memory of the software and the performance of your machine when handling large volumes of projects.
The simplest 3-tier structure I use that still produces results is what I call a 1-3-9 pyramid.
T1 - 1 project - pointed at the money site/landing page(s)
T2 - 3 projects - pointed at the T1 project, which results in 3 inbound links being built to each T1 link on average
T3 - 9 projects - each T2 project has 3 T3 projects pointed at it, which results in each T2 link having 3 inbound links on average
If you can understand that structure, then you can start to build something bigger. The biggest pyramid that I've run on one install is a 1-10-100, which results in 10 inbound links being pointed at each link in the tier below. So every T1/T2 link has 10 inbound links each. Very powerful for rankings. To make more T2/T3 links, I just reset the T2/T3 projects once they've finished and run them again, pointing links at the same T1 projects. If you delete the target URL cache/history on all projects, it will make new links on all 3 tiers, meaning new T1 links on top of the ones already sitting in the T1 verified list.
If you do this 10 times, then the T1 links will end up with 100 inbound links and each of the T2 links will always have 10 inbound links.
That's about as big as you can go as that template uses 111 projects and can only run one of these at a time per install. Depends on your site list as well. Small site lists will use less memory, so you might be able to run 2 of these on one install. Best to test and see what the limits are.
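To sanity-check the arithmetic above, here's a minimal sketch. It assumes every project verifies roughly the same number of links per run, which is a simplification - real counts depend on your site list and success rate:

```python
# Minimal sketch of the reset-and-rerun arithmetic described above.
# Assumption: each run, every project pointing at a tier adds one inbound
# link to each link in that tier; real numbers vary with the site list.

def inbound_links(projects_pointing_at_tier: int, reruns: int) -> int:
    """Inbound links each link accumulates after `reruns` runs of the tiers above."""
    return projects_pointing_at_tier * reruns

print(inbound_links(10, 1))   # 1-10-100 run once: 10 inbound links per T1/T2 link
print(inbound_links(10, 10))  # reset T2/T3 and rerun 10 times: 100 inbound links per T1 link
```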
The other way of doing it is to build each tier separately, but it's a real pain to do it this way and requires a lot of manual labour. The above strategies are fully automated.
Doing each tier separately allows you to build a much bigger pyramid, but it's a nightmare to manage. The more T2 links you have, the more T3 links you'll need.
sickseo, I know the post is nearly a year old now, but I'm hoping you can clear one thing up for me as I can't quite wrap my head around it, even after re-reading your explanation a few times.
I'm still unclear how adding additional separate projects creates more links than just a single project can, if we can set a single project to create however many links we want per day/total.
Is it a resource spread thing, or what am I missing here?
I think this thread can be a sticky post as a Best Practice Guide.
In T1, do you use all links that you scrape, or do you only use links with higher Domain Authority (DA) like DA 15+?
Thank you
Here's the thing about DA. It's a Moz metric and not a Google metric. Incredibly unreliable. There are plenty of 3rd party data services out there like ahrefs, moz, majestic, seo spyglass. They all have their own version of DA. The figure is never the same for a domain when you check them. It wouldn't be either, as none of them have a database the size of Google's, only a fraction of it. All of these data sources are incredibly unreliable.
I pay no attention to DA. It's useless as a metric. You'd think links from high DA sites are what it's all about, when in actual fact it's about the number of external links that the high DA site has as well. The higher the number of external links, the less link juice gets passed through each link, meaning you won't get as much of a ranking boost as you'd expect.
A low DA site with fewer external links can actually pass more link juice and have a greater impact on rankings than a higher DA site with a greater number of external links.
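A toy PageRank-style calculation illustrates the point. The scores and link counts below are made up, and this is a simplification rather than how Google actually computes anything:

```python
# Toy illustration: the value a page can pass is split across its external
# links, so a weaker page with few external links can pass more per link than
# a strong page with hundreds of them. Numbers are invented for the example.

def juice_per_link(page_strength: float, external_links: int, damping: float = 0.85) -> float:
    return damping * page_strength / max(external_links, 1)

print(juice_per_link(page_strength=8.0, external_links=400))  # "high DA" page: ~0.017 per link
print(juice_per_link(page_strength=1.5, external_links=20))   # "low DA" page:  ~0.064 per link
```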
I also manipulate the DA of every single site in my site list by building links to the homepage automatically within my campaigns. 10% of all links are automatically sent to the homepage across all tiers. So even newly scraped sites with DA0 will automatically have their homepage powered up with links. They don't stay at DA0 for very long lol. The knock-on effect is that the pages that hold my links also increase in PA as the DA increases.
So yes, I use all links that I scrape.
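As a rough illustration of the 10% homepage idea, here's a generic sketch. It is not the actual project settings - the function name, URL and percentage are just for the example:

```python
import random
from urllib.parse import urlsplit

# Generic sketch: when picking which lower-tier URL a new link should point at,
# send roughly 10% of links to that site's homepage instead of the deep page,
# so the domain itself gets powered up along with the page holding the link.

def pick_target(verified_url: str, homepage_share: float = 0.10) -> str:
    if random.random() < homepage_share:
        parts = urlsplit(verified_url)
        return f"{parts.scheme}://{parts.netloc}/"
    return verified_url

targets = [pick_target("https://example-site.com/forum/thread-123") for _ in range(20)]
print(targets)
```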
Each time 1 project runs, it will create "x" amount of links. If you run 2 projects simultaneously, it will create double the amount of links in the same time frame. If you run 10 projects simultaneously, then you're now making 10 times the number of links in the same time frame.
By pointing multiple projects at a single project, you're significantly increasing the number of inbound links a lower project will get within the same time frame versus just using one project for each tier. Nothing wrong with using just one project in each tier, but it will take longer to reach the same link numbers.
The goal is to create as many inbound links for lower tiers as quickly as possible. Having extra projects in each tier facilitates that.
Let's say one of your projects only makes 1000 links each time it runs. This is the most basic 3-tier example:
T1 - 1 project - 1000 links
T2 - 1 project - 1000 links
T3 - 1 project - 1000 links
That would create 1 inbound link to each link in the tier below. Let's look at something bigger:
T1 - 1 project - 1000 links
T2 - 3 projects - 3000 links
T3 - 9 projects - 9000 links
Now we're making 3 inbound links to each link in the tier below. Now we go even bigger:
T1 - 1 project - 1000 links
T2 - 10 projects - 10,000 links
T3 - 100 projects - 100,000 links
Now we're making 10 inbound links to each link in the tier below, which is clearly way more powerful than the first example using the most basic 3 tiers with only 1 project in each tier. Boosting DA/PA is all about links from unique domains. 1 link between 2 sites creates link juice. More inbound links = more link juice.
Also, it doesn't just have to be a 3-tier structure; you can build something bigger with 5 or even 7 tiers. For more competitive keywords, you will need to use something bigger.
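If it helps to see the general case, here's a small sketch that computes the same numbers for any pyramid shape, including deeper 5- or 7-tier builds. It assumes a flat links-per-project figure, which is a simplification:

```python
# Sketch: given projects per tier (T1 first) and an assumed links-per-project
# figure, print total links per tier and the average inbound links each link
# receives from the tier stacked on top of it.

def pyramid_stats(projects_per_tier, links_per_project=1000):
    totals = [p * links_per_project for p in projects_per_tier]
    rows = []
    for i, total in enumerate(totals):
        inbound = totals[i + 1] / total if i + 1 < len(totals) else 0.0
        rows.append((f"T{i + 1}", projects_per_tier[i], total, inbound))
    return rows

for tier, projects, links, inbound in pyramid_stats([1, 10, 100]):      # the 1-10-100 build
    print(tier, projects, links, inbound)

for tier, projects, links, inbound in pyramid_stats([1, 3, 9, 27, 81]): # a deeper 5-tier variant
    print(tier, projects, links, inbound)
```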
I saw your new website and it looks great! Good luck with it!
Can you share how many unique dofollow domains (target URLs) you have? I think the main "struggle" is having more target URLs... Also, if you can share, what indexer service are you using these days? Thanks!
The website is a work in progress, but glad you like it. Took a while to get the design right to something I really liked. It's a bit thin on the content side, but I've got lots of SEO content planned, when I have the time.
My site list stats at the moment are just over 50k unique domains across all engines. Most of these are redirect and indexer links though. Probably about half of those are do follow.
The forum, article and social network sites are the really important ones for T1 and I've got just under 1000 domains that are do follow, most are gnuboard. Combined with my other tools rankerx (500 DF) and money robot (750 DF), there are just over 2000 do follow domains that I can use as T1 that support keyword anchor texts. Plus I've started to grow my own PBN network which has 100 sites now - all DA20+. These were brand new domains a few months ago at DA0.
For indexing, I abandoned indexing/crawling services as they can't cope with the volume of links I send them lol. Pricing is extortionate for the volume of links I build. One service even banned me lol
I've built my own in-house system instead using zennoposter. It gets the links crawled, which is all I'm interested in. Plus it's literally crawling for an unlimited number of backlinks, which is what I needed. It certainly isn't a miracle indexing service, but crawled links still count towards ranking calculations and the system does get all links crawled by Google 100%.
1. What are you using MoneyRobot for exactly? From my experience, MoneyRobot builds toxic backlinks, from their own web farm and every MoneyRobot customer is building backlinks on the same properties and has done for the past 10 or more years.
2. For your ZennoPoster system, is this method something that @sven can incorporate into GSA SEO Indexer?
1. I'm running money robot links in the same way as any other tool. Direct to money site, as well as to power up other links. You're right about the site list being a private link farm. All owned by the software developer.
I run it with 90% generics though, as the templates are pretty big and it can make thousands of links pretty quickly. Easy to over-optimise. It's no different to what rankerx is doing: thousands of users hammering the same group of sites. GSA SER users are doing the same thing with the built-in CMS - right now we're all hammering the gnuboard CMS lol. There was FCS networker which also did the same thing, thousands of users hammering the same group of sites. Now we have serlib, which is allowing users to do the same thing. It's really not an issue at all. The only real question is "does Google index these links?"
I know they show as toxic links if you use sem rush. I also use seo spyglass to monitor links, but they don't show up as toxic links in there - links have zero risk of penalty. Here's the thing: neither of these 3rd party services represents what Google sees and thinks. Personally, I pay zero attention to any of these 3rd party tools.
If you use the "site:" command on the web 2.0 blogs, bookmarks and wiki sites, the domains are all indexed in Google, with thousands of pages showing as indexed from each property. They are permanent links, 100% do follow and also indexable by Google. So aside from what a 3rd party data service is saying about links being toxic, they seem to be the perfect link source.
If the domains were deindexed from Google, that's when I'd believe that Google has an issue with them. But whilst Google continues to index links from these sites, I'll continue to use the software for backlinks lol
That's a weekly rank checking report for one of my clients' websites. It's been hammered with all 3 tools - gsa, rankerx and money robot - for nearly 3 years. Whilst sem rush shows many of the backlink sources as toxic, my rank checking report shows it doesn't matter lol. Between the 2 data sources, I'm inclined to ignore sem rush and rely on my own data from the rank checking report. Upward movements in rankings are the only data I pay any attention to. The site is DA 50 now and growing.
Also, if these types of links really did any harm to rankings then I'd be using them to run my negative seo service for clients instead lol
2. All I'm doing is making sitemaps of backlinks and pinging them to Google. So yes, it could quite easily be incorporated into GSA SER or GSA SEO Indexer. You would need somewhere to host the sitemap files though.
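For anyone curious what the sitemap part could look like, here's a minimal sketch. It is an illustration, not the actual zennoposter template, and note that Google has since retired the sitemap ping endpoint, so a current setup would submit the sitemap via Search Console or robots.txt instead:

```python
from urllib.parse import quote
from urllib.request import urlopen
from xml.sax.saxutils import escape

# Minimal sketch: write backlink URLs into a standard sitemap.xml, host it
# somewhere crawlable, then notify Google. Illustrative only.

def write_sitemap(backlink_urls, path="sitemap.xml"):
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in backlink_urls)
    xml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
           '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
           f"{entries}\n"
           "</urlset>\n")
    with open(path, "w", encoding="utf-8") as f:
        f.write(xml)

def ping_google(sitemap_url):
    # Legacy ping endpoint, kept only to mirror the approach in the post;
    # expect it to return 404 these days.
    with urlopen("https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")) as resp:
        return resp.status

write_sitemap(["https://example.org/profile/123", "https://example.net/wiki/some-page"])
```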
Hey @rastarr, how did you decide MR backlinks are toxic?
Hi @sickseo, welcome back and glad that you're back.
I think @googlealchemist means: instead of maintaining 13 projects:
T1 - 1 project - 1000 links / day
T2 - 3 projects - 3000 links / day
T3 - 9 projects - 9000 links / day
we could maintain only 3 projects:
T1 - 1 project - 1000 links / day
T2 - 1 project - 3000 links / day
T3 - 1 project - 9000 links / day
assuming factors such as VPS speed, memory, and the number of proxies can only create a maximum of 13,000 links per day.
However, I notice that GSA running 10 projects simultaneously can create a lot more links per day compared to running only 1 project.
It's not possible to increase the number of links that 1 project makes, as it depends on your site list and number of sites in there. Even increasing the number of accounts/posts per site doesn't always work as planned as not every platform supports this. You have to reset the project and run it again for it to make more links.
But if you know how to make a single project make more links, I'd love to hear how you do that.
Thanks again for the detailed reply, mate. It's much appreciated.
[1] As a newbie who started with Money Robot, SEM Rush scared the crap out of me with their toxic link report. I've dusted off my lifetime license and will plug it back into powering up other links. Great to know your real-life experience.
What tool produced that weekly rank-checking report, if I may ask?
[2] It would be great if @sven could/would add this sitemap ability to GSA SEO Indexer and make the program much more useful. Obviously some changes need to be made, but it would make it an even better indexing tool. Fingers crossed!
I use it for keyword research as well as rank tracking. You can have it track thousands of keywords. You'll need private proxies to run the checks, but you also get a monthly quota where they use their proxies to run the checks for you.