
Scalability of tiered link building

I'm calling on @LeeG, @ron, and others who have experience using GSA SER for large-scale tiered link building and/or maximizing LPM. But everyone, feel free to chime in.

Now, I've been using a similar setup, inspired by @Ozz here:


I still struggle to consistently get over 35 LPM, but that's not the issue here; I could read other threads for that. There is ample information on it here already.

Every day, looking at my stats, I feel my setup fails to build enough links to satisfy each tier, even though I only use three tiers instead of the four that Ozz and ron use.

Then today I read a thread started by ron (you can find it here: https://forum.gsa-online.de/discussion/2930/ser-tiers), and it occurred to me that perhaps I should know the figures better. So I fired up Excel and put together a quick formula as follows. I've never been good at math, so correct me if I'm wrong.

[screenshot: spreadsheet projecting daily submissions and verified links per tier]

The above screenshot was made based on the following:

- Number of T1 links per day: 10
- Verification percentage: 10 percent (conservatively) for each tier
- Number of T2 links per day: x20 per URL
- Number of T3 links per day: x20 per URL
- Number of T4 links per day: x20 per URL
- Time before verification: 1 day
- No T1A, T2A, etc.; just contextual T1, T2, T3 and a kitchen-sink T4 (T3A).

Note: In ron's strategy, T3A is assumed to be T4, because it links to T3.

Look at the figures highlighted in blue.

At day 30, while still submitting 10 links per day to the money site, the total number of verified T1 links has grown to 29.

Daily submissions required for T2 at day 30: 560. Total verified T2 links so far: 756.

Daily submissions required for T3: 14,040. Total verified T3: 11,700.

Daily submissions required for T4: 208,000.

Remember, this is for only one money site URL (T0). At the end of the second month, at this rate, SER would need to be submitting millions of links every day to satisfy the 20x-per-URL option for the tier 3 links.
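
For anyone who wants to sanity-check the compounding, here's a minimal sketch of the same projection in Python. This is not my actual spreadsheet, and the day-boundary conventions probably differ by a day, so it lands close to (not exactly on) the figures above:

    # Assumptions from the list above: 10 T1 submissions/day, a 10%
    # verification rate at every tier, 20 lower-tier submissions per
    # verified upper-tier URL per day, and a 1-day verification delay.
    TIERS = 4            # T1 (contextual) down to T4 / T3A (kitchen sink)
    T1_PER_DAY = 10.0    # daily submissions aimed at the money site
    VERIFY_RATE = 0.10   # conservative verification percentage
    LINKS_PER_URL = 20   # daily lower-tier submissions per verified upper-tier URL

    def fmt(vals):
        return ", ".join(f"T{t + 1}={v:,.0f}" for t, v in enumerate(vals))

    verified = [0.0] * TIERS   # cumulative verified links per tier
    pending = [0.0] * TIERS    # yesterday's submissions, awaiting verification

    for day in range(1, 61):
        # Yesterday's submissions clear verification today (1-day delay).
        for t in range(TIERS):
            verified[t] += pending[t] * VERIFY_RATE
        # Today's workload: T1 is fixed; every lower tier owes
        # LINKS_PER_URL submissions per verified URL in the tier above it.
        today = [T1_PER_DAY]
        for t in range(1, TIERS):
            today.append(verified[t - 1] * LINKS_PER_URL)
        pending = today
        if day in (30, 60):
            print(f"Day {day} submissions required: {fmt(today)}")
            print(f"Day {day} verified so far:      {fmt(verified)}")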

This is a problem everyone adopting tiered link building will face sooner rather than later. Even with a setup as efficient as LeeG's, an instance of SER can barely keep up if I want to fulfill the link quota for every tier, every day.

In other words, even when everyone uses the exact same setup and options, a small decision such as how many projects I run per instance of SER will determine whether my T2 and T3 links get enough juice that day, or none at all. And in turn that will impact indexation and ranking.

So please help me. How do you cope with this issue:

1) Just let SER do its best. In this case, I have a few questions: how does "Use verified URL of..." work? Does it pick URLs in sequence or at random after each submission, or does it move to the next URL only after it has built 20 links for the current URL? If the latter, how do you make sure the latest links are given their share of lower-tier links? Maybe @Sven can chime in and clarify how this feature behaves?

2) Use an indexer service / software to help with indexing issues. (Assuming that besides sending more power to upper-tier links, the purpose of lower-tier links is also to help get them indexed.)

3) Re-verify every so often to minimize waste, and build links only to verified links that are still live. (I know at least ron does this every week.)

4) Delete verified URLs every so often so SER has more time to build links to new URLs. If so, after how many days? How many links per upper-tier link before you stop and move on? (See the sketch after this list for what I mean by 3 and 4.)

As a side note: perhaps SER should have an option to make deleting URLs older than xx days / weeks easier?

I can imagine this option becoming an important step. Perhaps I should move on after each T2 and T3 link has 30 verified links of its own (30 days), then delete them to give newer links a chance. Is anyone doing this?

5) Scale to another VPS / dedicated server. If this is part of your strategy, how many projects and URLs do you run per instance of SER?

6) I think ron mentioned in another thread keeping a spreadsheet of verified links from day to day. That is a great way to keep track of progress. If too many projects are not progressing, in terms of verified links gained per day, then perhaps it's time to optimize first, and then scale vertically / horizontally. (The request for "submitted for the day" and "verified for the day" columns beside each project is, I think, a great idea: https://forum.gsa-online.de/discussion/1147/new-feature-submitted-today-verified-today-by-project It would help in this case too, among others, as ron pointed out in that thread.)

7) Other advice or ideas?
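
To make 3) and 4) concrete, here's a hypothetical pruning pass in Python, run outside SER against an exported list of verified links. The "url,target_url,date_added" CSV layout and the file names are my own invention, and SER has its own re-verify feature; this just sketches the idea of keeping only live links and retiring old ones:

    import csv
    import urllib.request
    from datetime import datetime, timedelta

    MAX_AGE_DAYS = 30  # retire verified URLs older than this (option 4)

    def link_is_live(url, target):
        # Option 3: the page must still resolve and still contain our link.
        try:
            with urllib.request.urlopen(url, timeout=15) as resp:
                return target in resp.read().decode("utf-8", errors="ignore")
        except Exception:
            return False

    cutoff = datetime.now() - timedelta(days=MAX_AGE_DAYS)
    kept = []
    with open("verified_links.csv", newline="") as f:
        for url, target, added in csv.reader(f):
            if datetime.strptime(added, "%Y-%m-%d") < cutoff:
                continue  # too old: stop feeding it, make room for new URLs
            if link_is_live(url, target):
                kept.append((url, target, added))

    with open("verified_links_pruned.csv", "w", newline="") as f:
        csv.writer(f).writerows(kept)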

I know the numbers are not etched in stone. Tiered link building is not perfect: links get deleted, not every link should have a specified number of inbound links (to avoid footprints), etc. I'm not asking for exact figures either, just guidelines, hints, tips, anything to make this more cost-effective and profitable for all of us.

Let's make this thread an interesting discussion for "scaling GSA SER". (I apologize for the long post.)

Comments

  • ronron SERLists.com
    edited May 2013

    Wayyyyy too long of a post for starters.

    I'll just cut to the chase. The number of links you build, and the number that are verified, have everything to do with the platforms and engines you select.

    A long time ago, I had engines submitting tens of thousands of links with only 100 verified. I probably use only 1/3 of all the engines to make links. The other engines are way too inefficient probably because they are moderated or whatever.

    You must compare the data on verified links vs. submitted links. That is where your problems lie.

    You are getting into all this other stuff which really is not the issue.

  • Hey Ron,

    Would you mind walking through how you compare submitted and verified to weed out inefficient engines? I've been looking for a while now and have yet to see a viable way of doing it; I guess I'm missing something. If you've already covered this in another post, then apologies.

  • ronron SERLists.com
    edited May 2013

    Go to Options>Advanced>Tools>Show Stats>Submitted/Verified

    Just export the data to a spreadsheet. Get both on the same page. Play with the spacing so that the same engines are on the same rows. Then on each row, divide verified by submitted.

    You now have verified percentages for each engine. You also have raw data on the absolute number of verified links for each engine. I look at both of these numbers before deciding which engines are worthless.
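
    (An illustration of that calculation only, not ron's actual workflow: the same verified/submitted math as a small Python sketch, assuming both Show Stats exports were saved as simple "engine,count" CSV files. The file names and layout are assumptions.)

        import csv

        def load_counts(path):
            with open(path, newline="") as f:
                return {row[0].strip(): int(row[1]) for row in csv.reader(f) if row}

        submitted = load_counts("submitted.csv")  # hypothetical export
        verified = load_counts("verified.csv")    # hypothetical export

        rows = []
        for engine, sub in submitted.items():
            ver = verified.get(engine, 0)
            rows.append((engine, sub, ver, ver / sub if sub else 0.0))

        # Worst performers first; weigh the percentage AND the absolute
        # verified count before deciding an engine is worthless.
        for engine, sub, ver, rate in sorted(rows, key=lambda r: r[3]):
            print(f"{engine:<30} submitted={sub:>8} verified={ver:>7} rate={rate:.1%}")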

    I have made the mistake of cutting engines from projects, only to find they magically got improved by @sven... So always keep some throwaway projects running the 'bad engines' just to gather data. I have had to reverse myself a few times on cut engines.

  • Excellent thanks Ron, you're a gentleman :)

    Will have a play with it and see how the numbers look. Surprised to see my submissions are up to a total of nearly 1 million already, with verified around the 150k mark. Guess that's nothing compared to the amount you're churning out lol
  • ronron SERLists.com
    That's more than enough to get a very good read on your efficient, productive engines.
  • @ron I recall a few tricks mentioned by @LeeG, plus the one you brought up in this thread:


    Somehow I forgot to include those, but they would have made my post even longer.

    Frankly, this post is not merely about increasing LPM or verified links, but perhaps I should try to get each answer one by one.

    Thanks for stopping by. I appreciate your effort.
  • Okay I guess I will try this again @ron

    The above table (screenshot) is not my result, but a prediction of how many links SER has to submit and how many get verified. In short, look at the cells highlighted in blue.

    If I build 10 links per day to the money page, with 20x per verified URL per day for T2, T3, and T3A (T4), then at day 30 SER has to submit successfully 208,000 times just for that ONE URL. At day 60, SER has to submit successfully more than 2.2 million times for T4 alone, for that single money page.

    How do you solve this issue, and make sure each URL gets its own share of lower tiered links?

  • ronron SERLists.com
    edited May 2013

    You really don't solve that issue. No one can. All you can do is keep trimming the dead links, and make the top of the tree as perfect as you can so you don't waste link building underneath and SER resources. The biggest part of not wasting SER resources is what I said - make sure you use the most productive engines.

    What happens with all those tiers underneath? The actual plan doesn't literally require 20X links. It sure would be nice, but no software can do that for all your projects. SER handles it to the best of its ability.

    Everything you wrote is correct on all of your points. Setting tiers up properly is the majority of the battle. If you are using scheduler, you can handle an unlimited number of projects. The problem is that each project only gets a certain amount of sunshine. You have to be able to make the decision on when you need to start expanding to another VPS based on how many links you think each project needs.

    There are no rules of thumb because each website demands a different amount of links to rank and stay ranked.

    Expansion to another VPS and additional copies of SER and CB is truly a minor consideration. If you are making money, and it's time to expand, you have about $200 of one-time GSA software cost and $50 per month for a VPS. That's a very small cost for expansion.

    But as you plod along, you also need to cut out your crap. Quit hanging on to loser websites. Quit building links and wasting SER resources on websites that aren't worth a damn. Learn to quit wasting time breathing life into the dead, and start creating new websites. Only a few seeds that you plant will really take root. That's the way this business works.

    And after all this trimming and adding, you might find yourself only needing the same VPS and the same copies of SER and CB. But slowly you evolve to making 5X the money you were making just one year ago with the exact same set of tools, and the exact same VPS. Then you expand for real, not because your ego demands it, but because you truly have reached your limit with one SER, one CB and one VPS.  

  • Once again Ron lays it out exactly how it is down to a T. 
  • @Hunar, as someone who's running multiple VPSes and SERs at the same time, I was hoping you could give some tips.

    @ron Thanks. I guess this is the part where a business decision has to be made, although I'd hoped it would be more methodical.

    I mean, many people here have mentioned they're running 25 or so projects at the same time with one SER. But all those T2 and T3 links may not get enough "sunshine", as you said, hence indexation is low and rankings fail to climb. (I know there are many reasons why pages don't get indexed, but I mean this is one of them: the page not getting enough links.)

    It is "possible" that it was not that we can't rank the keywords, but more of failure to use SER properly. I mean, if only we run it 2-3 projects at a time, so all of them got enough share sunshine, then move on after all of them are proven worthy and unworthy, the perhaps some of those keywords that we should've thrown out without proper use or SER will grow into stronger tree.

    You know, without enough opportunity, a talented kid may not grow into what s/he could be, right?

    But of course, using one instance of SER to do this will take too much time. Imagine being able to test only 3 keywords every 3 months. 

    That is why I want this thread to be about how to make sure each keyword / URL gets its fair share of sunshine, while at the same time running as many keyword / URL campaigns as my capital / time can afford.

    I repeat, I know this is not etched in stone, but those who run multiple VPSes and SERs surely have something to share that we can all learn from here.

    So far what I've learned is to let SER do its job: use some of the tips from other threads, as well as what I summarized above, to make sure SER is building links to verified links that are still live, then check indexation and perhaps use services / software to help get some of the links indexed.

    I really like your feature request https://forum.gsa-online.de/discussion/1147/new-feature-submitted-today-verified-today-by-project for adding "submitted today" and "verified today" fields. At least those are quick indicators of whether projects are still getting their share of sunshine. If projects begin to lag further behind every day, after all the optimization, it's time to scale horizontally.

    But yeah, I can see your point, how you approach your projects. I need more feedback from others about their approach.
  • AlexRAlexR Cape Town
    Great thread! Appreciate the time you have taken. 

    For me, I focus on getting some higher quality at the start rather than volume. It takes a little longer but requires fewer resources. As you mentioned, the exponential numbers are just too great.

    There are many webmasters ranking sites using manual methods, and I think we need to look closely at how they do it, and then automate that. If they can rank using manual submission, we should be able to match and exceed them. But no webmaster is manually creating hundreds of thousands of links... the time it would take is just unrealistic. So we need to find some sort of middle ground here.
  • One thing I forgot to mention: I prefer checking backlink data with Majestic SEO rather than ahrefs. The former has more complete data, usually finding 2-6 times more links than ahrefs. Besides keeping a spreadsheet of submission / verification totals from day to day, I also watch the stats from Majestic SEO from time to time.

    If, instead of growing, my links decrease and link discovery slows down... I know there is a bottleneck somewhere. Usually that's when rankings get stuck or drop.

    @AlexR: Exactly. This is the mindset I approach SEO with too. I looked high and low for software that could duplicate manual Web 2.0 creation, and tried a few ZennoPoster bots. Basically, I want something that can post to Web 2.0 sites repeatedly, maintaining a growing network of free blogs I can tap into on demand: delete default posts, change themes, insert images and videos (those are usually what separate manual from automatic Web 2.0 submissions).

    Those of us using SER are building better T3 links, though, I think. Many SEOs use only forum profiles and SB blasts for T3 to get T2 links indexed, while we use dozens of different platforms for variety. SB blasts still help, though, because those targets have likely already been indexed by Google, which helps indexation.

    They have been getting great results from that, so there's no reason we couldn't. It just takes a bit of patience, since time and link velocity are becoming more important nowadays.

    For now I'm building 3 tiers instead of 4 to keep them manageable. Quality over quantity: better to have 5x 10x 10x than to aim for 30x, struggle to build enough links to satisfy the lower tiers and get them indexed, and end up with 30x 2x 2x 3x.

    But then, most of my sites are very new. I refuse to use this on my authority site (running since 2005) unless I'm absolutely certain there's no harm, now or in the future. (I can't be certain, which is why I always hesitate.)


  • AlexRAlexR Cape Town
    Regarding web2.0 I have a similar approach. I am hoping that the feature to use SER engines with a repost scheduler (i.e. post a new article once per week to each web2.0) will be added to SER. 
  • @AlexR Don't put high hopes on that, I'd say. There is one paid service here that I know of, plus one in development, that supports growing Web 2.0 platforms. They require too much work to maintain, unless someone wants to do it and contribute it to GSA.
  • AlexRAlexR Cape Town
    I'm pretty sure a feature is in the mix for something like that with web2.0's and SER Engines. Will need @sven to confirm? 
  • Yeah, s/he is requesting approval too. Once approved, it will be in business. You can get 25 fixes for already-existing engines for free; just Google to find them. I haven't tested them though, so I don't know if they work.
  • @audioguy: Why don't you have much hope for a feature like a "repost scheduler"? I think it's hard to accomplish, but I bet Sven will come up with a feature like that sooner or later.

    "...delete default posts, change themes, insert images and videos"

    Those things already happen when you are using SERengines. To insert images and videos, you need to customize your articles.


  • Oh don't get me wrong. Free as part of SER, not a service, I should add.

    I'm aware of the service. But still, it requires too much work. I've seen it first hand with other services using ZennoPoster (I bought ZP just for that reason, so you know I'm very serious). There are many ways it can break.

    Say 50% register successfully, fluctuating from day to day depending on changes, etc. Some fail to verify. Others fail during deletion of default posts, changing of themes, and the steps in between. Others don't support images or videos, so you end up posting garbled code.

    After that, it may post successfully the first time, but 50% fail again on the repost. In the end, out of 100 supported platforms, there are perhaps only 7-8 free blog platforms you can post to regularly. And platforms keep changing, so some I can no longer maintain for weeks, while other new ones get supported that week, etc.

    With one service I'm using, I got so frustrated that I suggested the author stop adding platforms just to make the service look good, and instead support about two dozen platforms very well. That would make the service gold. So if an existing service proves that feature flawless, even for just 20 platforms, consider me a lifetime customer.

  • Well, as far as tips from me for running multiple dedis: Ron basically said exactly what I've done. I began running SER on my home computer until I started making money and had more campaigns than it could handle. Got 1 VPS, pushed it past its limit, then got another. I went on like that until I had 4 VPSes; once I outgrew those, I traded them in for a dedi. And I just keep going like that. I have 7 dedis and 3 VPSes now; the VPSes are for client / SEO-related stuff, and the 7 dedis are for my own stuff.

    If you read Ron's stuff, he basically gives you the blueprint for making money, minus the keyword research. Once you've got that down, plus what Ron has said, you are pretty much set.

    That's why I think this forum is better than a lot of the others I've read. If I had read the kind of information that ron and others have given here 3-4 years ago, instead of piecing everything together myself, it would be a totally different game for me right now.
  • @Hunar Thanks for your insight. I'm learning a lot. I already have 2 VPSes right now, so your setup is kind of my inspiration. How do you figure out when it's time to move on to a dedi? Please share more details :)

    I grasp both of your approaches. But please understand that is only one strategy (a good one that I've learned a lot from for my own, btw).

    So I'm pursuing more of a mixed strategy right now, and I'm really only in the early stages, which is why I need more info to prepare for what's ahead.

    Obviously both of you are adopting a micro or mini niche strategy: spend perhaps $20-$50 on content per site, churn out as many of them per week as you can, and promote. Kill the ones that flop after 1-3 months. Focus on the rest. Keep adding more to your portfolio of sites. Some suggest that if I hit successful niches, I should scale them. Good advice.

    I'm doing exactly that for the short term. But for the long term, I'm aiming for authority sites, the kind that require $500-$1000 in initial funding (mainly for content) to get up and running. I want to develop 12-20 of those over 2-3 years. Right now I'm starting with 3.

    Now, obviously I can't kill such a project as easily. They are my babies. If you're a father, you know you don't just make more babies and hope some of them become successful. You put in what you have to take care of them; you make sure they get their best share of sunshine, opportunities, or whatever you want to call it.

    Of course, this is still business. I'm in no way attached to any of them, and I'll make a sound business decision if any of them proves to be a flop. But this is why I started this thread.

    If I have to buy a VPS and SER for each site to make sure they get enough links for each tier, that's what I will do. But before that, I want to make sure I'm maximizing every resource I have: optimizing options to get the most LPM, indexation, re-verifying to keep links fresh and live, etc. And I'd like a few tips on how to recognize when it's time to scale.

    Based on the figures I posted in the original post, I realize it will happen sooner than most people expect, if they want to build tiered links properly.

    Once it's working, it's an easy decision to scale a tight operation. So yes, I'm going to throw things at the wall and hope some of them stick for the short term. But I want to maximize what I've got for my long-term projects. I guess I can do better than that for them.

    So please if anyone has experience, do share.
  • AlexRAlexR Cape Town
    @Hunar - how many primary anchors do you use per campaign and how many secondary anchors?
  • @audio Figuring out when you need another is pretty self-explanatory: the server can't handle any more. :)

    @alex It all depends on the number of competing pages and the niche I'm going for. I'm always changing them, and sometimes I don't even use secondary anchor text. It literally depends on the other sites on the first page and the competing pages. In competitive niches I obviously have to be a bit more aggressive; in less competitive niches I can ease up on it.
  • Lol. Call me dense, but it obviously isn't self-explanatory for me. Based on the figures I posted above, running SER for 30-60 days for one URL would exhaust the limits of one VPS, no matter how much I optimize my projects.

    I know it's perhaps best to make decisions based on ROI: if it makes more money than it costs within 3-6 months, it should be a good investment.

    I still hope there's another approach, but thanks. Another reassurance definitely helps. Coming from a technical background, it's rather hard for me to dive into entrepreneurship, where you have to make this kind of decision frequently.
  • And I just noticed you're still learning new tricks with GSA even with such an operation. It's a push for me to take the plunge whenever my situation requires it, and not to over-analyze. :)
  • @Hunar Getting up to 7 dedis means you have a lot of sites. Just interested in how big they are: are you building small 2-5 page sites or bigger?
  • Another epic thread evolving here  :bz
  • @Hunar, it just occurred to me that we have discussed this before. I'm reading too many threads on this forum! Please ignore!
  • @Velocity: The answer you're looking for is here https://forum.gsa-online.de/discussion/comment/18276/#Comment_18276 @Hunar has over 500 sites. Much more now I'd imagine.
  • May I add: many of your links die and others never get indexed, so as long as you are proactive about trimming those out, it frees up resources to build tiers to only the best links.
  • @seagul Thanks. Exactly how often do you prune your links? Do you also delete older links to allow more time for newer ones?