
Get your links indexed - [Some Ideas]

Hey,

I know a lot of people suffer from the problem of not getting their links indexed, especially after the recent updates. Call it Penguin 2.0 or whatever, but what most people have noticed is the following:

- fewer links get indexed
- indexing is slower
- proxies get banned faster, so scraping for new targets gets harder

None of my pages really got "hit" by the recent updates in the sense of dropping 500 places in the rankings, but what I soon observed was a big loss of links overall.

In automated linkbuilding it is pretty normal to build links and lose them. If you're doing it the lazy way and build your T1 with SER (which most people here do for the sake of convenience), there is no such thing as a "safe" T1. You will lose links after a while: your content gets deleted, the page goes offline or gets deindexed, etc.

What counts is that you have a steady stream of fresh links to replace the lost ones and still gain links in total. Lose 90, build 100...

Most people here count on tiered linkbuilding (TLB) to do this, and so do I.

The problem with TLB is (and has always been) that the top of your pyramid gets extremely big at some point, and it becomes almost impossible to build multiple links to each of the top-tier properties. If you do the math (for the 10-10-10 blueprint most people use), you will see that after a while you need millions of backlinks per project, which is almost impossible.
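
To make that math concrete, here is a minimal sketch of how a 10-10-10 pyramid grows (plain Python; the 10,000 T1 figure is only an assumed example, not a number from this post):

    # Hypothetical 10-10-10 growth; the T1 count is an assumption for illustration.
    links_per_property = 10              # the "10" in 10-10-10
    t1 = 10000                           # T1 properties accumulated over time
    t2 = t1 * links_per_property         # 100,000 T2 links needed
    t3 = t2 * links_per_property         # 1,000,000 T3 links needed
    print(t1 + t2 + t3)                  # 1,110,000 links for a single project

And that only counts building each link once - it doesn't include replacing the ones you constantly lose.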

The other problem is that while you're building a bigger T1, you constantly lose T2, T3, etc. links too.

Imagine a pyramid where the top gets bigger all the time, while a little goblin in a Google T-shirt constantly takes stones away from the base. While the top grows, you're constantly trying to repair the base to save that damn thing from collapsing...

After some time you reach a critical point where your hands bleed, your back hurts and you can't keep up anymore.

Some of my older projects were suffering this fate last month, and since they had become real cashcows over time, I didn't want to lose them. They had already started dropping in the SERPs.

Since I couldn't find many new targets that I hadn't already parsed to repair the base of the pyramid, I first ticked the "continuously try to post to a page even..." option in my contextual T2 projects and created a catch-all email for them (so that I can create unlimited accounts).

The outcome was that SER built a LOT of new contextual links on platforms where it had already posted. But who cares? 30 articles on the same site, all pointing to different T1 articles - I don't think G will filter this. And far fewer targets are needed.

But obviously this wasn't enough; the T1 links still got indexed extremely slowly...

I did an index check on my T1 after a few weeks and saw something like 200/800 indexed. A lot of the indexed ones were extremely old, like 3-4 months. The new ones built after Penguin 2.0 had an indexing rate of maybe 5-10%.

I decided to delete all indexed T1s at that point. I know this is counter-intuitive, since the indexed ones at least pass link juice. But I checked the number of backlinks for some of these properties and they already had around 50-100 each.

The idea was that this way more links would flow to the non-indexed T1s and therefore get them indexed.

I had to decide between the concept of link juice and indexing. I think it's true that contextual properties pass link juice and maybe, over time, PR to your money site. But is it normal for a 700-word article on a Drupal blog to have 300 backlinks? I don't think so. The amount of link juice these properties can pass is limited.

IMHO it is enough to build links to them until they get indexed (they should have a bunch at that point) and then forget about them (which means deleting them). Instead of building hundreds or thousands of links to some BuddyPress property, those resources should be used to get new links indexed.

And guess what, this really worked out for me. I didn't change anything else (except these two things) and now I see many new indexed links popping up - and I'm starting to gain links in total instead of losing them.

I think these tips could be of value to some of you.

Today I even considered deleting every verified link in my T1 that is older than 30 days. If they don't get indexed after they have received a bunch of T2 and T1A links, they most likely never will. Or maybe they will over time, but you don't need to build additional links for that.
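
Combined with the earlier step of deleting the indexed T1s, that pruning rule boils down to something like this minimal sketch (Python; the record format, dates and URLs are made up for illustration - this is not how SER stores its verified lists):

    from datetime import date, timedelta

    # Hypothetical verified-T1 records: (url, verified_date, is_indexed).
    t1_links = [
        ("http://example-blog.com/post-1", date(2013, 5, 20), False),
        ("http://example-wiki.org/page-2", date(2013, 7, 1), True),
        ("http://example-web20.net/entry-3", date(2013, 7, 10), False),
    ]

    today = date(2013, 7, 15)                 # assumed "today" for the example
    cutoff = today - timedelta(days=30)

    # Drop T1s that are already indexed (they got their push) and T1s that
    # stayed non-indexed for 30+ days (they probably never will be indexed).
    keep = [link for link in t1_links if not link[2] and link[1] >= cutoff]

    print(keep)                               # only fresh, not-yet-indexed T1s remain

Everything that falls out of "keep" is a deletion candidate, so the T2/T3 capacity goes to the links that still need an indexing push.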

I hope I can help some people with this, and I'd appreciate it if you shared your own ideas and thoughts on it :)

Forgot to mention that I also disabled Lindexed a few weeks ago (not because they are bad, but because I think it's too obvious to the search engines), and I'm not using any other indexing service anymore, pinging etc. included.

Best Regards

Comments

  • AlexR Cape Town
    @Startrip - great post. So timeous! I have been researching this aspect and wondering about it too. I've found I have a very low index rate and it's driving me scatty trying to solve it.

    I just want to say that what you said was brilliant - "have a limit on the T2 links that get built to each T1 link" - so that at least the links get spread out.

    An interesting thing I've been reading is that just because a link is not indexed doesn't mean it has no value. I still need to see if this is true, but it came from a respectable person on BHW who mentioned it...
  • @AlexR >>> An interesting thing I've been reading is that just because a link is not indexed doesn't mean it has no value

    That's true. There are different steps in "indexing", and I can confirm that even "non-indexed" links count. They appear in GWT even if they are not indexed.

    Here is a picture by SEOmoz explaining the process (it's reverse-engineered, but it sounds very logical):

    But especially for contextual properties it is important to get them indexed, because otherwise they don't pass any link juice and you just get a "raw" link of low value (at least that's my theory).

    Cheers,
    Sebs
  • AlexR Cape Town
    If your content for the contextual links is unique, the articles are longer than 600 words, readability is good, and there's 1 link out to your site - surely this should get indexed naturally?

    I've got a set of manual web 2.0s I'm reviewing, and out of 20 I think only 5 or so were indexed, despite each having a few pages. This is very strange... will need to do some digging, as it seems there's a lot of BS on the forums about indexing from people who aren't checking what's actually going on.

    As you said, just because it has a tier doesn't mean anything, and I'm pretty sure G is looking at how to neutralise tiering if they haven't done so already. 
  • edited July 2013
    >>> If your content for the contextual links is unique, the articles are longer than 600 words, readability is good, and there's 1 link out to your site - surely this should get indexed naturally?

    I can't say, because I'm not using readable content. I use 700+ word texts that I create with a macro, with 1 link in a random place, where the grammar is "ok" but the text makes no sense. It's like "%Keyword% the bathroom king is printer the light and said convinced lazarus". I cancelled Kontent Machine because the output wasn't really better when I spun heavily, and I often got doubled words like "the the". I'm not paying 37 a month for that. I really recommend using a macro and scraped, topic-relevant content. I posted a macro somewhere here and developed an advanced one from that. That way you get unreadable crap, but it's unique. G can't detect it, at least not in this century.

    Uniqueness still beats quality IMHO. At least the basics of grammar should be correct (not 3 nouns in a row or something), but almost anything goes.

    I have the same problem with web 2.0s. I created some manually and used hand-spun content (readable and making sense); some of them still took months to get indexed, and one only popped up now after almost 6 months... (blogya). That's why I don't use them anymore and go for heavy spam.

    You're right, there is a lot of BS about indexing, and even more BS about "SEO". Find something that works and scale it. No need to overanalyze why it works ;)

    And I can't think of a way for Google to fight tiering, because tiering is what happens naturally on the internet.
  • AlexR Cape Town
    Just did an index check on 50 unique 700-word articles for a project - all verified - and I got 0/50 indexed in Google! Yes, they're pretty new, but that's very, very low!

    Yes, they have a few backlinks pointing to them, but these are great articles.

    Will test out a new technique on them and see how it goes. 
  • edited July 2013
    OK, this is a very interesting thread.

    How are you both checking indexing? In SER, or Scrapebox, or...?
  • Startrip, how do you set up the macro so the links in the articles end up in random places?
  • edited July 2013
    @baba I have 5 different templates; one is chosen randomly, and the links end up in different places (roughly like the sketch below).
    @ambition In SER, but I don't think that matters, since it only "googles" the URL.
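
    For anyone wondering what that looks like in practice, here is a minimal sketch of the idea - random template, random link slot (Python; the template strings, the %LINK% placeholder and the make_article() helper are assumptions for illustration, not the actual macro from this thread):

        import random

        # Hypothetical templates with a %LINK% placeholder in different spots
        # (the thread mentions 5 templates; three are enough to show the idea).
        TEMPLATES = [
            "%LINK% {body}",                # link at the very top
            "{body_a} %LINK% {body_b}",     # link somewhere in the middle
            "{body} %LINK%",                # link at the end
        ]

        def make_article(body_words, anchor_html):
            template = random.choice(TEMPLATES)              # pick a template at random
            cut = random.randint(1, len(body_words) - 1)     # random split for the middle slot
            article = template.format(
                body=" ".join(body_words),
                body_a=" ".join(body_words[:cut]),
                body_b=" ".join(body_words[cut:]),
            )
            return article.replace("%LINK%", anchor_html)

        print(make_article("some scraped topic relevant words".split(),
                           '<a href="http://example.com">keyword</a>'))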
  • What happens when you check if links are indexed in SER with Google-banned proxies?
  • AlexR Cape Town
    @sawa73 - great point! I will check that out. My proxies are registering as passed, so it should be fine, but I will need to double-check!