
Indexing - Things I Have Learned

ron SERLists.com
edited April 2013 in GSA SEO Indexer

I just wanted to start this thread as a lot of people have asked questions about indexing.

Just some initial comments: I am not claiming to be the expert on indexing. But getting this thread started, and reading the ensuing comments, will probably help people. Plus keeping it all in one place will make it easier for people to find.

Indexing - Why It Matters

Let's say you have two websites that are otherwise identical. Site 1 has 10,000 backlinks, and 5,000 of those links are indexed. Site 2 has 10,000 backlinks, but only 2,000 of those are indexed. Assuming the backlinks themselves are of equal quality, I would bet big money that Site 1 outranks Site 2.

Indexing is the equivalent of the 'Google Library' keeping that link (book) on its shelves. In other words, indexed links have value - they were deemed worthy of being noted and stored by Google. The more indexed links you have aimed at your site (across all tier structures), the better you will rank.

This is why indexing matters.

How To Check On Indexing

I use Scrapebox because it is quicker, and I always use public proxies. Be prepared to do several iterations to completely check your list, as public proxies burn out fast.

I try to do this once a month, and I only export from SER the links created since I last checked. That way the list stays manageable.

I keep the results segregated, so I export the T1 results only, the T2/T3 results next, and finally T1A/T2A/T3A as the final file.

Once I check on the indexing, I keep the results segregated again so I know what the heck I am doing: T1-Indexed, T1-NotIndexed, etc. That way I only send the links that are not already indexed out for special indexing.
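
If you want to script this check instead of (or alongside) Scrapebox, here is a minimal sketch of the idea in Python. To be clear, this is an illustration under assumptions, not Scrapebox's actual method: the file names, the site: query approach, the crude "URL appears in the results" test and the 1-second delay are all mine, and you supply your own fresh public proxies in proxies.txt.

```python
import random
import time
import urllib.parse

import requests

URLS_FILE = "T1.txt"          # links exported from SER (illustrative name)
PROXIES_FILE = "proxies.txt"  # fresh public proxies, one ip:port per line

def load_lines(path):
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]

def is_indexed(url, proxy):
    """site: query through one public proxy.

    True/False when the check worked; None when the proxy failed
    (or Google blocked it) and the URL must be retried next pass.
    """
    query = urllib.parse.quote(f"site:{url}")
    try:
        r = requests.get(
            f"https://www.google.com/search?q={query}",
            proxies={"http": f"http://{proxy}", "https": f"http://{proxy}"},
            headers={"User-Agent": "Mozilla/5.0"},
            timeout=15,
        )
        if r.status_code != 200:   # captcha/ban page - treat proxy as burned
            return None
        return url in r.text       # crude test: URL shows up in the results
    except requests.RequestException:
        return None

urls = load_lines(URLS_FILE)
proxies = load_lines(PROXIES_FILE)

indexed, not_indexed, retry = [], [], []
for url in urls:
    result = is_indexed(url, random.choice(proxies))
    if result is None:
        retry.append(url)          # dead proxy - check this URL again later
    elif result:
        indexed.append(url)
    else:
        not_indexed.append(url)
    time.sleep(1)                  # go easy; public proxies die fast enough

# segregated output files, matching the T1-Indexed / T1-NotIndexed scheme
for name, bucket in [("T1-Indexed.txt", indexed),
                     ("T1-NotIndexed.txt", not_indexed),
                     ("T1-Retry.txt", retry)]:
    with open(name, "w") as f:
        f.write("\n".join(bucket))
```

The Retry file is why you need several iterations: everything that failed on a dead proxy goes back through the next pass with a fresh proxy list.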

Pinging

I was using the GSA SER pinger for the first 4 months or so. But since I was already sending things to Lindexed automatically via the API, and since Lindexed already includes pinging as part of its process, I discontinued using SER's pinger - it was a waste of resources and a drag on performance. Fewer SER functions = higher LPM.

If you use the SER pinger, I strongly encourage you - when the pinging routine starts (which you can tell by looking at the scrolling log) - to print about 5 minutes' worth of the log to a text file on your desktop. Why?

You will find that many pinging services return 'failed' remarks. I took that text file, sorted it in Excel, and discovered that half of my pinging services were outright failures. Talk about a waste of time.
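
If you'd rather skip Excel, a few lines of Python will do the same sort. Again, a sketch under assumptions: I'm guessing at the log format (one line per ping attempt containing the service URL, with the word "failed" on failures), so adjust the parsing to whatever your SER log actually prints once you dump it to a text file.

```python
from collections import Counter

# Assumed log format: each ping attempt is one line mentioning the
# service URL and, on failure, the word "failed". Adjust to taste.
attempts = Counter()
failures = Counter()

with open("ser_ping_log.txt", encoding="utf-8", errors="ignore") as f:
    for line in f:
        low = line.lower()
        if "ping" not in low:
            continue
        # crude: take the first http(s) URL on the line as the service
        for token in line.split():
            if token.startswith("http"):
                service = token.split("/")[2]  # keep just the host part
                attempts[service] += 1
                if "failed" in low:
                    failures[service] += 1
                break

# sort services by failure rate, worst offenders first
for service in sorted(attempts, key=lambda s: failures[s] / attempts[s],
                      reverse=True):
    rate = failures[service] / attempts[service]
    print(f"{service}: {failures[service]}/{attempts[service]} failed ({rate:.0%})")
```

Anything near the top of that output is a candidate for removal from your ping list.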

You only need 5-10 pinging services at most. Many would argue you only need one reliable one, and they are probably correct. Regardless, narrowing down the services will noticeably speed up your LPM.

Does pinging really work? I'm not a huge fan of pinging. I think it helps, but most people overdo it with too many services, and I would think Google pays attention to ping spammers. So take it easy on the pinging.

Natural Indexation

If you do nothing - no pinging, no services, no backlinks - some pages will index on their own. If you plan on doing extra legwork to help indexation, understand that natural indexing occurs over weeks and months. That makes it hard to tell whether a page was indexed naturally or with the help of some service you employed - so keep that in mind when judging results.

I believe it is in your best interest to intervene - to use some type of service to boost the indexation rate, and go beyond the natural rate.

Why do some things get indexed while others do not? Value. What do you think is easier to get indexed: 1) an 800-word article on your topic, or 2) a forum profile with a URL? Which has more value? Which has the higher likelihood of getting indexed? Obviously #1 does. So the types of links you build, and your mix of platforms, will greatly influence your indexation percentages.

Indexing Tiers

I just ran an analysis of my indexing rates on T1, T2, T3 - my contextual tiers - and T1A, T2A, T3A - my junk-crap-kitchen-sink tiers. Here are the results:

T1 = 34%

T2 & T3 = 38%

T1A, T2A & T3A = 75%

So the two questions you should have are: Why are my T1, T2, T3 rates so low? And why are the crap tiers so high?

The answer:

The T1/T2/T3 contextual properties are brand-new webpages. The older ones are indexed at 50%-70%; the newer ones at 0%-10%. That averages out to about 36% (for example, half the pages at ~65% and half at ~7% blends to exactly 36%). These pages were created out of thin air. It takes a while - and some work - to get them to index.

The T1A/T2A/T3A links are mostly comments on existing webpages. And how did I post on these pages? Well, GSA searched Google and the other search engines. And guess what? If GSA was able to find them in a search, it means they were already indexed!!! So by definition, when you leave a link on an existing page that is already in the search engines, you get the value of indexing for free. It funnels up through your tiers to your money site.

Again, take a look at your indexation on these tiers. You will probably find similar results. What it means is that you should work harder on getting your T1, T2 and T3 indexed, because that is the hardest part.

Indexing Services

If you ping, some pages will get indexed. I would consider pinging the least effective way to get indexed.

If you add a service like Lindexed (and there are many others), your pages will get crawled by Googlebot. Crawling is the first step before anything can get indexed: crawled = noticed. Lindexed gets your pages a 100% crawl rate, so it sure doesn't hurt. I have found Lindexed the most cost-effective way to get extra indexation above and beyond the natural rate. You can send 50,000 links per day, and the cost is $25 per month. I can vouch for this service - it really helps. I also know there are competitors. Use whatever you want. Regardless of what you do, I would encourage you to use something!
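
For what it's worth, most of these services follow the same basic pattern: you POST batches of your verified URLs to an API endpoint and they handle the pinging/crawling from there (as mentioned above, SER can already feed Lindexed automatically via the API). Here is a minimal sketch of that pattern in Python - the endpoint, parameter names, and batch size are all hypothetical, so check your service's actual API documentation:

```python
import requests

API_KEY = "your-api-key"
# Hypothetical endpoint and parameters - substitute your service's real API.
ENDPOINT = "https://api.example-indexer.com/submit"
BATCH_SIZE = 500  # assumed per-request limit; services also cap daily volume

with open("verified_links.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# submit in batches so one bad request doesn't lose the whole list
for i in range(0, len(urls), BATCH_SIZE):
    batch = urls[i:i + BATCH_SIZE]
    r = requests.post(ENDPOINT,
                      data={"apikey": API_KEY, "urls": "\n".join(batch)},
                      timeout=30)
    r.raise_for_status()
    print(f"submitted {i + len(batch)}/{len(urls)} links")
```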

I have been using GSA Indexer with a lot of success. It is best to run it on another computer, on another internet connection, when you use it on full blast. I use it in Full Indexer mode (300 threads), and only on 'sites that can accept deep links'. You end up with about 420 dofollow links and about 30 nofollow links per URL you submit. I have noticed a very nice jump in indexation with this tool - as much as a 25% boost in a couple of weeks. So for a measly $20, Sven has a great tool that will help you.

Last but not least, the most effective way on the planet to get something indexed is to build backlinks to it. But you can't keep building tiers - otherwise what are you doing? T15, T16 LOL. If I had the discipline and the time, I would take a pure spamming tool like Xrumer and plaster the bottom tiers with a zillion links. Hah! And that would be the ultimate conclusion - sealing my spam with an extra heavy-duty layer of spam. I don't do it, but that's what I would do if I had 40 hours in a day.

Summary

The things I focus on:

1) T1, T2, T3 - try to do everything you can to get these properties indexed

2) If on a budget, use SER's pinger, but weed out the failing ping services and only use a handful of good ones at most

3) Try to hook up with a service like Lindexed - or its competitors - it will definitely help

4) Use GSA Indexer, but try to run it separately from SER (or greatly reduce the threads in Indexer if you use it concurrently with SER)

5) Lastly, understand the types of links you are building. Pull up that pie chart, and see what you are building in the big picture sense. It will help you to understand what the heck you are doing! 

 


Comments

  • Where is the LIKE button?
  • Thanks Ron
  • edited April 2013
    +1 ron for sharing pure clarity and wisdom : ) - thank you
  • Awesome post Ron. I agree, indexing does correlate to ranks.
  • AlexR Cape Town
    @Ron - thanks for sharing. Some interesting points. 
  • Excellent share Ron
  • Cheers @ron
  • Great post @ron you are a gentleman. I'd like to add that RSS feeds are another indexing method that's both free and effective. SER has RSS capability built in guys ;)
  • @ron Thanks for the great points!
    @seagul How does this RSS capability work in SER?
  • Again you need to do some homework if you don't know how RSS feeds work. If you know how RSS feeds can help SEO then read this:
    https://forum.gsa-online.de/discussion/994/rss-feed-creator-and-rss-feed-submit-to-aggregators/p1
    The best way to learn is to play with it (SER, that is).
  • AlexR Cape Town
    @seagul - the problem with SER's RSS feature is that it doesn't update feeds! This would be a fantastic feature if it did, as I could then get all my links indexed, but as I understand it you still have to update the feed manually? Maybe @Sven will add a feature that auto-updates the feed and lets us create one feed per project/group of projects - that would be the best feature ever!
  • Ozz
    edited April 2013
    fantastic write-up. needs to be added to the "compiled tips" thread.

    for your effort i painted something for you. its called "Karma le Ron" ;)
    [image]
  • AlexRAlexR Cape Town
    Brilliant! ROFL!  =))
  • ron SERLists.com

    @Ozz that is hysterical! I really like the psychedelic colors.

    @seagul, that's funny, because I started to write something about RSS feeds but dropped it: I found it was a) too much of a pain in the ass, b) overkilled by the SEO community for years, and more importantly c) already handled by Lindexed and other similar providers.

  • edited April 2013
    ron is a gentleman!
  • @Ozz Brilliant picture of @ron!! Looks similar to my Chameleon Here >
    [image]
  • Brandon Reputation Management Pro
    Hey @Ron, I appreciate the time and energy put into this, but I was hoping for some more evidence. I haven't done the testing myself, but pure speculation isn't enough for me to invest time and server resources into heavily indexing my links rather than letting G find them whenever it can.

    Indexing - Why It Matters

    I would bet big money Site 1 outranks Site 2.


  • Your chameleon looks sad. Maybe he wants you to teach him how to use SER
  • ron SERLists.com

    @Brandon, I don't have a white lab coat on, and I don't have the time to run tests in a lab. I need to make a living, and sometimes I just use what's between my ears to make intelligent judgments.

    For example, I do not need evidence or scientific proof to 'know' it is better to have higher PR links pointing to my site as opposed to lower PR links.

    No one can 'know' anything with respect to Google's algorithm - perhaps the most guarded secret on the planet. Not to mention the fact that it evolves and changes every hour.

    What I can tell you from experience is that when I actively work to index my backlinks, my rankings always improve. That is all the proof I need. 

  • Great read @ron, but you should check out Inspyder Backlink Monitor for keeping tabs on your links. It can handle tiered linking structures and it looks at stuff like PR, anchor text, indexing and more... quite the time saver. You can then export the un-indexed links with a few mouse clicks. Keeping an overview of your links in Excel and Scrapebox is a nightmare :S
  • ron SERLists.com
    edited May 2013

    @tryggvirafn - Thanks for the heads up on Backlink Monitor. I looked at that a long time ago and forgot about it - I think it's worth a second look.

    So far it honestly hasn't been too much of a hassle doing it the way I have been - I kind of just let everything run and scoop up the results when it's finished.

    But I think BM makes it more of a one-click deal. Checking the indexing on hundreds of thousands of links can't be done with private proxies though, as they get banned immediately. That's why you have to use public proxies.

    How does BM handle the proxy issue?

  • AlexR Cape Town
    Interesting idea about that tool. I also checked it out a while ago. 

    @tryggvirafn, if you have 100 projects on a VPS, are you able to easily export the entire VPS into the tool, or how do you import the links? If I recall correctly, you had to import them project by project, which was not really an option for me.

    @Ron - been reviewing the indexing aspect and wanted to ask your thoughts on:
    1) Does it make sense to index the T0/T1 contextual links? These have sub-tiers on them, and in your case the sub-tiers are getting 75% indexed, so it stands to reason that the contextual links above them should get indexed as a result of the high indexing rate of the sub-tiers.
    Running an indexer on these contextual links seems like duplicated logic to me.
    (On the other hand, these have the lowest index rate, so it makes sense to run an indexer on them as they are the most important links! - yes, I know it's a contradiction, which is why I'm asking!)

    2) Does it make sense to run an indexer on any secondary-type links (blog, image, forum, etc.)? These generally get indexed, since they had to be in Google's index already in order to be found. Would it not be better to focus the energy on the primary contextual-type links?

     
  • Thanks Ron, nice post. I'd like to add a correction: blindly dropping links on indexed pages doesn't mean you automatically gain the link-building value. Pages such as comment pages can get a much slower crawl/cache frequency by the time new comments populate them, so requesting a re-index may help sometimes.
  • ron SERLists.com

    @AlexR

    1) I still do it, because I found GSA Indexer to have a measurable positive effect - its links are a different type of platform. I agree that the backlinks underneath should be enough, but they don't always do the trick.

    2) I'm only running the Indexer on the T1A/T2A/T3A junk links that aren't already indexed. I know it will get some of them indexed, but I am curious to see if it is worthwhile overall. I never measured the results on this tier separately from everything else - this time I am measuring the effect. I still do it last, and only when I have the extra capacity to let that many links run through the Indexer.

    @Dunce - agreed that it was a general comment, but links on already-indexed pages still get crawled faster than the alternative. I can only deduce that many get crawled pretty quickly, as I can see it in my rankings - and so should you. The great part about Lindexed is that I get a 100% crawl rate on all the links I drop, whether on an indexed or non-indexed page - so it helps me from both sides of the equation.

  • @AlexR - yes, you have to import the links project by project, which is a semi-pain, but I have 25 projects running, all tiered with different tools, and I still think it saves me a great deal of time (and of course we love software and stuff... and updates and upgrades), and it helps me with the "big picture" - index rate, anchors, etc.

    I just grab all my links (Ctrl+A, right click, show verified links, Ctrl+A, Ctrl+C) and paste them into each project, which then sorts out all the tiers, duplicates, and dead links. Links that belong to other projects will fall under the dead category. It then runs through them checking on various things. I then repeat for the other projects. I do this once a month so it's not really an issue.

    They have some kind of limited free trial here: http://www.inspyder.com/products/BacklinkMonitor/Default.aspx

    Anyway, enough about that - @ron tell us more nifty secrets about indexing!
  • ron SERLists.com

    @tryggvirafn - You still have to answer my question from above:

    "How does BM handle the proxy issue?"

     

  • BM uses proxies @ron
  • ron SERLists.com
    Thanks @seagul. It would be even cooler if it sucked up public proxies on its own to run on, but I can accept that.
  • edited May 2013
    I haven't fired it up in a while but I'll check. I used to use it with proxy goblin when PG worked well.
  • edited May 2013
    Yes, you can import proxies from a file. So if you populate the file with fresh proxies from a scraper, you can run public proxies with BM. Like I said, I did this with Proxy Goblin before I had SER. Back in my SENUKE days........ :ar! :ar! :ar!