
What is the best practice for tiering with GSA in 2017 and beyond?


I am starting out with tiered link building and am full of doubts and guesses.
I am trying this: GSA (T2) >> Web 2.0 (T1) >> main site, with some variations.

So I have a few questions:
How safe is GSA at T2?

I have read some cases (from the Penguin 3.0 era) where sites lost rankings due to GSA links at tier 2 and even tier 3. I have also read that Google has an algorithm in place to detect a site's proximity to link farms like GSA targets.

So even if we use GSA at a far lower tier, Google can sniff it out, though there are obvious reasons for Google to tolerate it, or half the internet would get penalized. I guess with Penguin 4 Google changed its policy from demoting spam-powered sites to just ignoring spam links and neutralizing them.

I am pretty sure that GSA spam can be detected algorithmically, but at what level would it be safe: T2 or T3?

GSA (T3) > Web 2.0 (T2) > Web 2.0 (T1) > MS?
GSA (T2) > Web 2.0 (T1) > MS?

What are the best practices for selecting targets?

I have my own verified sitelist now. Are there any critical parameters that I should filter on?
Is it important to select only those targets where the OBL (outbound link) count is low and below a certain threshold?
Is it important to eliminate certain engines that are seen as risky, or can I just post to any postable engine?
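To make the OBL question concrete, here is a rough Python sketch of what "filter targets by outbound-link count" means: count the links on a target page that point away from its own domain, and keep only pages under a chosen threshold. GSA-SER has its own built-in OBL filter; this standalone snippet (threshold and domain names are just placeholders) only illustrates the idea.

```python
# Illustration only: count outbound links (OBL) in a page's HTML.
# GSA-SER applies an equivalent filter internally via its options.
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Counts <a href> links that point to a domain other than the page's own."""
    def __init__(self, page_domain):
        super().__init__()
        self.page_domain = page_domain
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        domain = urlparse(href).netloc
        # Only links leaving the page's own domain count toward OBL.
        if domain and domain != self.page_domain:
            self.count += 1

def obl_count(html, page_domain):
    parser = LinkCounter(page_domain)
    parser.feed(html)
    return parser.count

# Placeholder threshold: keep targets with fewer than 50 outbound links.
OBL_THRESHOLD = 50

sample = '<a href="https://other.com/x">a</a><a href="/internal">b</a>'
print(obl_count(sample, "mysite.com"))  # -> 1, so this page would pass
```

A low OBL count is usually preferred because each outbound link on the target page dilutes whatever value the page passes on.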


  • Deeeeeeee (the Americas)
    edited October 2017
    "I am pretty sure that GSA spam can be detected algorithmically, but at what level would it be safe: T2 or T3?"

    Interesting topic, my friend! 

    I think about SEs and spam algorithms constantly. What may be considered "unnatural" is something I think about a lot, too.

    What can and cannot be detected? I'm sure with each passing year, they narrow the gap and it gets more and more difficult.

    GSA-SER is just a toolset.

    You can use it for purely white-hat methods, you can use it for anything, really. Depending on what you do, the results will vary.

    You can blast away and spam without limit, of course, too.  (That has its place.)

    The thing is, "gsa spam" is not a real term!

    What GSA produces, and how it does so, depends on the user settings and content being posted.
    The settings allow for incredibly varied campaigns and user control.

    I think that this is more important than just saying it's safe at T2 or T3, because GSA-SER does not produce uniform results; it's used in so many different ways. Also, in my experience, different keywords seem to follow different rules, if that's possible.

  • @Deeeeeeee

    I am using GSA at present only for spam links, to boost the upper Web 2.0 tiers and help with their indexing. I am targeting keywords that have low competition.

  • Deeeeeeee (the Americas)
    edited October 2017
    @feedpanda: For spam links from some of the less-frequently-updated, widely circulated lists, it's probably best to use GSA in the way you are.

    I think lower tiers, and even your target URL, can tolerate a small percentage of spam links directly without causing issues.

    But why take the risk?! And what will you gain? Even still, I admit I've done this at times.

    It is safer to keep that distanced altogether; everyone seems to agree on that at this point.

    If you have the time and space, why not test this?

    Over n% of spam links, SEs get mad. Find n. And of course, this isn't some set rule; everything seems to vary by KW. (At least, that's how it seems to me. Any confirmation of that in anyone else's experience?)

  • Currently, I am creating Web 2.0s using SEREngines.

    So it's like:

    Money site <<<< T1: Web 2.0s with human-written content via SEREngines + contextual links <<<< T2: contextuals + do-follow links (bad-word filtered / countries) <<<< T3: everything except flagged CMSs

    So far it works. 

    BTW, I am working on CPA niches.
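The tier filters mentioned above (bad-word filtering, skipping flagged CMSs) amount to post-processing a target list. Here is a minimal Python sketch of that idea; the word lists and URLs are placeholders I made up, and in practice GSA-SER applies these filters through its own options rather than an external script.

```python
# Illustration only: drop sitelist targets whose URL matches a bad-word
# list or a "flagged" CMS footprint. Both sets below are placeholders.
BAD_WORDS = {"casino", "viagra"}   # placeholder bad-word list
FLAGGED_CMS = {"plurk"}            # placeholder flagged-CMS footprints

def keep_target(url):
    """Return True if the URL passes both filters."""
    lowered = url.lower()
    if any(word in lowered for word in BAD_WORDS):
        return False
    if any(cms in lowered for cms in FLAGGED_CMS):
        return False
    return True

sitelist = [
    "http://blog.example.com/post",
    "http://casino-links.example.net/page",
    "http://bookmarks.example.org/plurk/submit",
]
print([u for u in sitelist if keep_target(u)])
# -> ['http://blog.example.com/post']
```

Filtering on the URL alone is crude; a fuller version would also check the fetched page content, which is closer to what the built-in bad-word filter does.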

  • @fivcat
    Thanks for sharing your strategy.
    Which Web 2.0 platforms do you find most effective, and how many pages do you recommend creating on each?
    Does T2 consist only of GSA backlinks?
    Which CMSs are bad or should be avoided?
  • May I ask, what are "flagged CMSs"?
  • @andy1024

    I guess those are CMSs/platforms that have been flagged by Google. I remember a social bookmarking PHP script called Plurk that was deindexed by Google.