
Let links index naturally or send them through an indexer for tiers?

I have started a few 3-tiered projects recently and I've been wondering what really is the best way to handle getting links found. It seems half the people say don't do anything, just let it happen naturally, and the other half say to send them through an indexer. Any recommendations? So far I have not pinged any links at all, and won't until I decide I want to.

I figured if I should run any tier through an indexer, it should probably be tier 3. If those links get found, there's a chance the tier 2 links would follow. With tier 3 links, I know the link types typically used there mean you run the risk of creating links that will never get found, so I assumed T3 would be my best bet for spending my pinger/indexing resources.

I have GSA Indexer too btw. 

Comments

  • Run them through the indexer, but drip-feed them.
  • andrzejek (Poland)
    edited June 2015
    You want to index (by whatever method) only pages that can't be found by the Google spider (e.g. K2).
    You can index your links just by building links to them from engines that don't create their own pages (there are exceptions).
    But of course you can ping / index all your links...

    If you want to be safe when pinging, the usual approach is to trim to the last folder in Scrapebox and then ping that (it doesn't work on all engines); see the small sketch below this list.
    I personally don't ping / index at all (with exceptions like K2); it totally depends on the platform (engine) where you left your link.
    I like to keep it natural, but that's just me...
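For anyone unfamiliar with the "trim to last folder" trick mentioned in the comment above: it just means cutting the posted URL back to its parent directory before pinging. Here is a minimal sketch of that trimming in Python (this is not Scrapebox's actual implementation, and the example URL is made up):

```python
from urllib.parse import urlsplit, urlunsplit

def trim_to_last_folder(url: str) -> str:
    """Cut a URL back to its last folder: drop the final path segment
    (the page/file name) plus any query string or fragment."""
    parts = urlsplit(url)
    path = parts.path
    # Keep everything up to and including the last '/', or just '/' if there is none.
    folder = path[: path.rfind("/") + 1] if "/" in path else "/"
    return urlunsplit((parts.scheme, parts.netloc, folder, "", ""))

# Hypothetical example:
# trim_to_last_folder("https://example.com/blog/2015/06/some-post.html?sid=3")
# -> "https://example.com/blog/2015/06/"
```

Pinging the folder URL instead of the full page URL is what the commenter describes as the "safer" option; whether it helps depends on the engine, as noted above.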