
Indexing - Things I Have Learned

ron SERLists.com
edited April 2013 in GSA SEO Indexer

I just wanted to start this thread as a lot of people have asked questions about indexing.

Just some initial comments: I am not claiming to be the expert on indexing. But getting this thing started, and the ensuing comments, will probably help people. Plus keeping it in one place will make it easier for people to find it.

Indexing - Why It Matters

Let's say you have two websites that are equal in all other respects. Site 1 has 10,000 backlinks, and 5,000 of those links are indexed. Site 2 has 10,000 backlinks, but only 2,000 are indexed. Assuming the backlinks themselves are of equal quality, I would bet big money Site 1 outranks Site 2.

Indexing is the equivalent of the 'Google Library' keeping that link (book) on its shelves. In other words, indexed links have value - they are deemed worthy of being noted and stored by Google. The more indexed links you have aimed at your site (across all tier structures), the better you will rank.

This is why indexing matters.

How To Check On Indexing

I use Scrapebox for this because it is quicker, and I always use public proxies. Be prepared to do several iterations to completely work through your list, as public proxies burn out fast.

I try to do this once a month, and I only export from SER the links verified since the previous check. That way the list stays manageable.

I keep the results segregated: I export the T1 results first, the T2/T3 results next, and finally the T1A/T2A/T3A results as the last file.

Once I have checked the indexing, I keep the results segregated again so I know what the heck I am doing: T1-Indexed, T1-NotIndexed, etc. That way I only send the links that are not already indexed out for special indexing.
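If you want to script this check-and-split step instead of clicking through Scrapebox, here is a minimal sketch of the idea. The file names, the site: query, and the "URL shows up in the results page" heuristic are all my own assumptions - Scrapebox does this faster, and Google throttles naive queries hard, which is why the burned-proxy retry bucket exists.

```python
# Minimal sketch: check each exported tier file against Google through
# public proxies and split into Indexed / NotIndexed / Retry files.
import random
import requests

def load_lines(path):
    with open(path, encoding="utf-8") as f:
        return [line.strip() for line in f if line.strip()]

def is_indexed(url, proxies):
    """Naive check: ask Google for site:<url> through a random public proxy."""
    proxy = random.choice(proxies)
    try:
        r = requests.get(
            "https://www.google.com/search",
            params={"q": f"site:{url}"},
            proxies={"http": f"http://{proxy}", "https": f"http://{proxy}"},
            timeout=10,
        )
        # Crude heuristic: the results page quotes the URL when it is indexed.
        return url.split("//")[-1] in r.text
    except requests.RequestException:
        return None  # proxy burned out - retry this URL on the next pass

def segregate(tier):
    urls = load_lines(f"{tier}.txt")        # assumed export file name
    proxies = load_lines("proxies.txt")     # assumed public proxy list
    indexed, not_indexed, retry = [], [], []
    for url in urls:
        result = is_indexed(url, proxies)
        if result is None:
            retry.append(url)
        elif result:
            indexed.append(url)
        else:
            not_indexed.append(url)
    for name, bucket in [("Indexed", indexed), ("NotIndexed", not_indexed), ("Retry", retry)]:
        with open(f"{tier}-{name}.txt", "w", encoding="utf-8") as f:
            f.write("\n".join(bucket))

for tier in ["T1", "T2T3", "T1AT2AT3A"]:
    segregate(tier)
```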

Pinging

I was using the GSA SER pinger for the first 4 months or so. But since I was already sending everything to Lindexed automatically via the API, and since Lindexed already includes pinging as part of its process, I discontinued using SER's pinger - it was a waste of resources and a drag on performance. Fewer SER functions = higher LPM.

If you use the SER pinger, I strongly encourage you - when the pinging routine starts (which you can tell by looking at the scrolling log) - to print about 5 minutes' worth of the log to a text file on your desktop. Why?

You will find that many pinging services have 'failed' remarks. I took that text file and sorted it in Excel, and discovered half of my pinging services were outright failures. Talk about a waste of time.
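If you'd rather skip Excel, a few lines of script will do the same sort. This sketch assumes you saved the log to ser_log.txt and that failed ping lines contain the word "failed" plus the service URL - the exact log format may differ, so adjust the matching to whatever your log actually shows.

```python
# Count 'failed' ping entries per service and rank the worst offenders.
import re
from collections import Counter

failures = Counter()
with open("ser_log.txt", encoding="utf-8") as f:
    for line in f:
        if "failed" in line.lower():
            # Pull the first URL-ish token out of the line as the service name.
            match = re.search(r"https?://[^\s\"]+", line)
            if match:
                failures[match.group(0)] += 1

# Services with the most failures are the ones to remove from SER's ping list.
for service, count in failures.most_common():
    print(f"{count:5d}  {service}")
```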

You only need 5-10 pinging services at most. Many would argue you only need one reliable one, and they are probably correct. Regardless, narrowing down the services will noticeably speed up your LPM.

Does pinging really work? I'm not a huge fan of pinging. I think it helps, but most people overdo it with too many services, and I would think Google pays attention to ping spammers. So take it easy on the pinging.

Natural Indexation

If you do nothing - no pinging, no services, no backlinks - some pages will index on their own. But if you plan on doing some extra legwork to help indexation, understand that natural indexing happens over weeks and months. It can therefore get very murky whether a link was indexed naturally or with help from a service you employed - keep that in mind.

I believe it is in your best interest to intervene - to use some type of service to boost the indexation rate, and go beyond the natural rate.

Why do some things get indexed, and others do not? Value. What do you think is easier to get indexed: 1) an 800-word article on your topic, or 2) a forum profile with a URL? Which has more value? Which has the higher likelihood of getting indexed? Obviously #1 does. So understanding the type of links you build, and your mix of platforms, will greatly influence your percentages of indexation.

Indexing Tiers

I just ran an analysis of my indexing rate on T1, T2, T3 - my contextual tiers - and T1A, T2A, T3A - my junk-crap-kitchen-sink tiers. Here are the results:

T1 = 34%

T2 & T3 = 38%

T1A, T2A & T3A = 75%

So the two questions you should have are: Why are my T1, T2, T3 rates so low? And why are the crap tiers so high?

The answer:

The T1/T2/T3 contextual properties are brand new webpages. The older ones are indexed at 50% - 70%; the newer ones at 0% - 10%. It averages out to about 36%. These pages were created out of thin air, and it takes a while - and some work - to get them to index.
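To illustrate the averaging (a made-up split for illustration, not my actual counts): if half of the contextual pages are older and indexed at around 65%, and half are newer at around 7%, the blended rate is 0.5 × 65% + 0.5 × 7% = 36%.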

The T1A/T2A/T3A links are mostly comments on existing webpages. And how did I post on these pages? Well, GSA searched Google and the other search engines. And guess what? If GSA was able to find them in a search, it means they were already indexed!!! So by definition, if you leave a link on existing pages that are already in the search engines, you gain the value of indexing for free. It funnels up through your tiers to your money site.

Again, take a look at your indexation on these tiers. You will probably find similar results. What it means is that you should consider working harder to get your T1, T2 and T3 indexed because that is the hardest part.

Indexing Services

If you ping, some pages will get indexed. I would consider pinging the least effective way to get indexed.

If you add a service like Lindexed (and there are many others), your pages will get crawled by Googlebot. Crawling is the first step before anything can get indexed: crawled = noticed. Lindexed gets your pages a 100% crawl rate, so it sure doesn't hurt. I have found Lindexed the most cost-effective way to get extra indexation above and beyond the natural rate. You can send 50,000 links per day, and the cost is $25 per month. I can vouch for this service - it really helps. I also know there are competitors. Use whatever you want. Regardless of what you do, I would encourage you to use something!
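For the curious, pushing links to one of these services usually amounts to nothing more than an HTTP POST of your verified URLs. The endpoint and parameter names below are placeholders, not Lindexed's actual API - SER already handles this once you enter your API key - so treat it as a sketch of what happens under the hood.

```python
# Hedged sketch of batch-submitting verified URLs to an indexing service.
import requests

API_KEY = "your-api-key"                      # hypothetical credential
ENDPOINT = "https://example-indexer.com/api"  # placeholder, not a real service URL

def submit(urls, batch_size=100):
    # Send the links in manageable batches instead of one giant request.
    for i in range(0, len(urls), batch_size):
        batch = urls[i:i + batch_size]
        resp = requests.post(
            ENDPOINT,
            data={"apikey": API_KEY, "urls": "\n".join(batch)},
            timeout=30,
        )
        resp.raise_for_status()

with open("verified_links.txt", encoding="utf-8") as f:
    submit([line.strip() for line in f if line.strip()])
```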

I have been using GSA Indexer with a lot of success. When you run it at full blast, it is best to use it on another computer, on another internet connection. I use it in Full Indexer mode (300 threads), and only on 'sites that can accept deep links'. You end up with about 420 dofollow links and about 30 nofollow links per URL you submit. I have noticed a very nice jump in indexation with this tool - as much as a 25% boost in a couple of weeks. So for a measly $20, Sven has a great tool that will help you.

Last but not least, the most effective way on the planet to get something indexed is to build backlinks to it. But you can't keep building tiers - otherwise what are you doing? T15, T16, LOL. If I had the discipline and the time, I would take a pure spamming tool like Xrumer and plaster the bottom tiers with a zillion links. Hah! And that would be the ultimate conclusion - sealing my spam with an extra heavy-duty layer of spam. I don't do it, but that's what I would do if I had 40 hours in a day.

Summary

The things I focus on:

1) T1, T2, T3 - try to do everything you can to get these properties indexed

2) If on a budget, use SER Ping, but weed out the failing ping services and only use a handful of good ones at most

3) Try to hook up with a service like Lindexed - or its competitors - it will definitely help

4) Use GSA Indexer, but try to run it separately from SER (or greatly reduce the threads in Indexer if you use it concurrently with SER)

5) Lastly, understand the types of links you are building. Pull up that pie chart, and see what you are building in the big picture sense. It will help you to understand what the heck you are doing! 

 

Comments

  • Where is the LIKE button?
  • Thanks Ron
  • edited April 2013
    +1 ron for sharing pure clarity and wisdom : ) - thank you
  • Awesome post Ron. I agree, indexing does correlate to ranks.
  • AlexR Cape Town
    @Ron - thanks for sharing. Some interesting points. 
  • Excellent share Ron
  • Cheers @ron
  • Great post @ron, you are a gentleman. I'd like to add that RSS feeds are another indexing method that's both free and effective. SER has RSS capability built in, guys ;)
  • @ron Thanks for the great points!
    @seagul How does this RSS capability work in SER?
  • Again, you need to do some homework if you don't know how RSS feeds work. If you know how RSS feeds can help SEO, then read this:
    https://forum.gsa-online.de/discussion/994/rss-feed-creator-and-rss-feed-submit-to-aggregators/p1
    The best way to learn is to play with it (SER, that is). The sketch below shows the basic idea.
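    To make the RSS idea concrete, here is a rough sketch: wrap your verified links in a feed, host the file somewhere, and submit the feed URL to aggregators so crawlers discover the links. The file names and feed metadata are placeholders; SER's built-in RSS feature does this kind of thing for you.

```python
# Sketch: build an RSS 2.0 feed where each verified backlink is an <item>.
from email.utils import formatdate
from xml.sax.saxutils import escape

def build_feed(urls, path="links_feed.xml"):
    # One <item> per backlink so aggregators and crawlers can discover each URL.
    items = "\n".join(
        f"<item><title>{escape(u)}</title><link>{escape(u)}</link>"
        f"<pubDate>{formatdate()}</pubDate></item>"
        for u in urls
    )
    feed = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<rss version="2.0"><channel>'
        "<title>Recent links</title>"
        "<link>http://example.com/feed.xml</link>"  # placeholder feed URL
        "<description>Recently verified links</description>"
        f"{items}</channel></rss>"
    )
    with open(path, "w", encoding="utf-8") as f:
        f.write(feed)

with open("verified_links.txt", encoding="utf-8") as f:
    build_feed([line.strip() for line in f if line.strip()])
```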
  • AlexR Cape Town
    @seagul - the problem with SER's RSS feature is that it doesn't update feeds! This would be a fantastic feature if it did, as I could then get all my links indexed, but as I understand it you still have to update the feed manually? Maybe @Sven will add a feature that auto-updates the feed and lets us create one feed per project/group of projects - that would be the best feature ever!
  • Ozz
    edited April 2013
    Fantastic write-up. It needs to be added to the "compiled tips" thread.

    For your effort I painted something for you. It's called "Karma le Ron" ;)
    [image]
  • AlexRAlexR Cape Town
    Brilliant! ROFL!  =))
  • ron SERLists.com

    @Ozz that is hysterical! I really like the psychedelic colors.

    @seagul, that's funny because I started to write something about RSS feeds, but dropped it because I found it was a) too much of a pain in the ass, b) overdone by the SEO community for years, and more importantly c) already handled by Lindexed and other similar providers.

  • edited April 2013
    ron is a gentleman!
  • @Ozz Brilliant picture of @ron!! Looks similar to my Chameleon Here >
    [image]
  • Brandon Reputation Management Pro
    Hey @Ron, I appreciate the time and energy put into this, but I was hoping for some more evidence. I haven't done the testing myself, but pure speculation isn't enough for me to invest time and server resources into heavily indexing my links rather than letting G find them whenever it can.

    Indexing - Why It Matters

    I would bet big money Site 1 outranks Site 2.


  • Your chameleon looks sad. Maybe he wants you to teach him how to use SER
  • ron SERLists.com

    @Brandon, I don't have a white lab coat on, and I don't have the time to run tests in a lab. I need to make a living, and sometimes I just use what's between my ears to make intelligent judgments.

    For example, I do not need evidence or scientific proof to 'know' it is better to have higher PR links pointing to my site as opposed to lower PR links.

    No one can 'know' anything with respect to Google's algorithm - perhaps the most guarded secret on the planet. Not to mention the fact that it evolves and changes every hour.

    What I can tell you from experience is that when I actively work to index my backlinks, my rankings always improve. That is all the proof I need. 

  • Great read @ron, but you should check out Inspyder Backlink Monitor for keeping tabs on your links. It can handle tiered linking structures, and it looks at things like PR, anchor text, indexing and more... quite the time saver. You can then export un-indexed links with a few mouse clicks. Keeping an overview of links in Excel and Scrapebox is a nightmare :S
  • ron SERLists.com
    edited May 2013

    @tryggvirafn - Thanks for the heads up on Backlink Monitor. I looked at that a long time ago and forgot about it - I think it's worth a second look.

    So far it honestly hasn't been too much of a hassle doing it the way I have been doing it - I kind of just let everything run and scoop up the results when it's finished.

    But I think BM makes it more of a one-click deal. Checking the indexing on hundreds of thousands of links can't be done with private proxies though, as they get banned immediately. That's why you have to use public proxies.

    How does BM handle the proxy issue?

  • AlexR Cape Town
    Interesting idea about that tool. I also checked it out a while ago. 

    @tryggvirafn, if you have 100 projects on a VPS, are you able to easily export the entire VPS into the tool, or how do you import the links? If I recall, you had to import them project by project, which was not really an option for me.

    @Ron - been reviewing the indexing aspect and wanted to ask your thoughts on:
    1) Does it make sense to send the T0/T1 contextual links to an indexer, since these have sub-tiers on them, and in your case the sub-tiers are getting 75% indexed? It stands to reason that the contextual links above them should get indexed as a result of the sub-tiers' high indexing rate.
    Running an indexer on these contextual links seems like duplicated logic to me?
    (On the other hand, these have the lowest index rate, so it makes sense to run an indexer on them as they are the most important links! - yes, I know it's a contradiction, which is why I'm asking!)

    2) Does it make sense to run an indexer on any secondary-type links (blog, image, forum, etc.), since these generally get indexed anyway - they had to already be in Google's index in order to be found? Would it not be better to focus the energy on the primary contextual-type links?

     
  • Thanks Ron, nice post. I'd like to add a correction: links on indexed pages don't automatically gain the full value of link building. Pages such as comment pages can get a much slower crawl/cache frequency than the rate at which new comments appear on them, so requesting a re-index may sometimes help.
  • ron SERLists.com

    @AlexR

    1) I still do it because I found GSA Indexer to have a measurable positive effect - they're a different type of link platform. I agree that the underneath backlinks should be enough, but they don't always do the trick.

    2) I'm only running the Indexer on T1A/T2A/T3A junk links that aren't already indexed. I know it will get some of them indexed, but I am curious to see if it is worthwhile overall. I never measured the results on this tier separately from everything else - however, this time I am measuring the effect. I still do it last, when I have the extra capacity to let that many links run through the Indexer.

    @Dunce - agreed that it was a general comment, but links on already indexed pages still get crawled faster than the alternative. I can only deduce that many get crawled pretty quickly as I can see that in my rankings - and so should you. The great part about Lindexed is that I get a 100% crawl rate on all links I drop, whether on an indexed or non-indexed page - so it helps me from both sides of the equation.

  • @AlexR - yes, you have to import the links project by project, which is a semi-pain, but I have 25 projects running, all tiered with different tools, and I still think it saves me a great deal of time (and of course we love software and stuff... and updates and upgrades). It also helps me with the "big picture" - index rate, anchors, etc.

    I just grab all my links (Ctrl+A - right click, show verified links, Ctrl+A, Ctrl+C) and paste them into each project, which then sorts out all the tiers, duplicates, and dead links. Links that belong to other projects will fall under the dead category. It then runs through them checking on various things. I then repeat for the other projects. I do this once a month so it's not really an issue.

    They have some kind of limited free trial here: http://www.inspyder.com/products/BacklinkMonitor/Default.aspx

    Anyway, enough about that - @ron tell us more nifty secrets about indexing!
  • ron SERLists.com

    @tryggvirafn - You still have to answer my question from above:

    "How does BM handle the proxy issue?"

     

  • BM uses proxies @ron
  • ron SERLists.com
    Thanks @seagul. It would be even cooler if it sucked up public proxies on its own to run on, but I can accept that.
  • edited May 2013
    I haven't fired it up in a while but I'll check. I used to use it with proxy goblin when PG worked well.
  • edited May 2013
  • Yes, you can import proxies from a file. So if you populate the file with fresh proxies from a scraper, you can run public proxies with BM. Like I said, I did this with Proxy Goblin before I had SER. Back in my SENUKE days........ :ar! :ar! :ar!
  • I just ran a test on a tier 1 project. Like all projects, I feed the links automatically to Lindexed.

    This particular project had just 1 tier 2 (kitchen sink) pointing at it.

    I was at 22% indexed 4 days ago.

    I took all the unindexed links and ran them through GSA SEO Indexer (Full mode); it took 2 days to index about 1k links. I checked indexing today and I'm at 50% indexed.

  • ron SERLists.com
    It's the truth - totally worth it. I spent 5 days indexing links last week with Indexer.
  • oh yes, GSA Indexer works I can vouch for that!
  • ron SERLists.com
    One of my tricks with public proxies is that I take my spare copy of SER, turn it on, and have it write the public proxies to a file on my desktop. I then use that file of public proxies in Scrapebox to do my indexing check. It works very well. Filtering that file for live proxies first, as in the sketch below, saves a few iterations.
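    A quick liveness filter over that proxy file cuts down the Scrapebox iterations. This is just a sketch - the file names and test URL are assumptions - and public proxies die fast, so re-run it right before each check.

```python
# Sketch: test each public proxy in parallel and keep only the responsive ones.
import concurrent.futures
import requests

def alive(proxy):
    try:
        requests.get(
            "https://www.google.com",  # assumed test target
            proxies={"http": f"http://{proxy}", "https": f"http://{proxy}"},
            timeout=8,
        )
        return True
    except requests.RequestException:
        return False

with open("public_proxies.txt", encoding="utf-8") as f:   # file SER wrote out
    proxies = [line.strip() for line in f if line.strip()]

with concurrent.futures.ThreadPoolExecutor(max_workers=50) as pool:
    live = [p for p, ok in zip(proxies, pool.map(alive, proxies)) if ok]

with open("public_proxies_live.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(live))
```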
  • A dumb question on this that I couldn't find an answer to with a quick search: do you remove the indexed links from SER once verified, or leave them accumulating more backlinks?
  • Sorry @ron, but for some reason I did not get notified when you mentioned my name... so I totally missed the discussion after my last reply.

    Mr. @seagul is correct: it uses proxies but does not scrape, so you need your own source.
  • I've also been using Backlink Monitor. Just drop everything in and it sorts your links into tiers and checks whether each link is live, the domain and page PR, and whether it is indexed or not.

    I also use mine to send non-indexed links to Indexification.com using the API.

    Great tool.
  • @Mitch Ah, you've read Tops' threads?
  • @seagul just realized I posted this comment from page one not seeing page 2 was here.

    What is/are the Tops threads?
  • Tops from the SENuke forum. I call him the oracle. Sorry, I thought you knew him because you mentioned that forum on another thread.
  • Sorry with you now!

    Head in GSA and ScrapeBox and brain in the fridge!

    I actually got put on to Backlink Monitor by Matthew Woodward, he did a great review/video on his site.
  • Ah yes another fellow Brit. I haven't used BM in a while maybe I should fire it back up. I seem to have loads of tools but only use a few every day. Anyway I mention Tops because I learned a great deal from his in depth threads and videos. In fact he was the most helpful guy in the SE forum. A bit like @ron here LOL.
  • @seagul - BM has been updated several times in the last few months, and since I bought it last year it has changed quite a bit.
  • edited May 2013
    This is what I always like to share when people are talking about indexing, Lindexed, crawling, etc:

    [image]

    Old but true. Stare at "F-" for a few minutes and you will know why you need your kitchen sinks/index tiers, or whatever you want to call them. You can replace "unimportant" with "not enough"...

    Best Regards
  • AlexRAlexR Cape Town
    @Startrip - is there any way to see which index a page is stored in? Like, is it in B, C, D or F?
  • I don't think there is one...
  • @Ron have you found answers to this?

    "2) I'm only running the Indexer on T1A2A3A junk links that aren't already indexed. I know it will get some of them indexed, but I am curious to see if it is worthwhile overall. I never measured the results on this tierseparated from everything else - however this time I am measuring the effect. I still do it last, and when I have the extra capacity to let that many links run through the Indexer."

    I was about to add indexing to my overall strategy and thought I'd start with T123, but seems like you have a different approach.

    When you said

    "1) T1, T2, T3 - try to do everything you can to get these properties indexed"

    Do you feed links from T123 to indexer services directly or lower tiers so in turn higher tiers are found?



  • ron SERLists.com

    For me, every verified link, regardless of tier, first goes to Lindexed. In effect, that helps all contextual tiers.

    Anything I do with GSA Indexer is a separate deal, performed every so often (when I have extra time), to give an extra boost. The contextual tiers are my primary concern. Those are all brand new pages.

    Most of the links on the kitchen sink tiers are indexed quickly, as most of those links are some type of comment on already existing/indexed websites - as I have documented.

  • Greetings All!

    I tried GSA Indexer for the first time earlier today. I didn't know what to expect from it. I've been using Indexification - now I'm considering dropping that subscription and just using GSA Indexer. For me it's more than enough to get my work done, and it saves me the monthly subscription cash as well. I push local business sites in my area only, so you can imagine I don't even use half of my subscription's 50,000-links-per-day limit.

    I added a new client's domain to my host just this morning after getting his design ready. His domain is a few years old. When I checked his links, Google was only seeing 1 - literally 1. After I launched his new design online, I used GSA Indexer to start linking right away. I checked a little while ago and his links on Google are now at about 171 or so: ThunarElectric.com

    I'm new here, and I don't know if this has already come up - if so, I apologize in advance - but it would be great if the private proxies we use in GSA Search Engine Ranker could also be used in GSA Indexer while running both simultaneously?
  • donchino https://pbn.solutions
    A quick question... So far I have sent only do-follow links to GSA SEO Indexer (using it as my only indexer), but I try to keep do- and no-follow links at 50/50. Should I send the no-follows as well, or am I doing the right thing?
  • davbel UK
    edited August 2013
    @Chris you can use proxies with Indexer. Click the down arrow next to the green arrow and set them up just as you do with SER.
  • ron SERLists.com
    @donchino All links are important - send all links to your indexer.
  • donchino https://pbn.solutions
    @ron Thanks, I was thinking the same thing...

    I'd anxiously like to click the "Index Check" under the verified links list, but I want to be sure... does it use proxies? Should it use proxies in the first place? I am just wondering how safe it is, or if it's going to check all the links in the index with my VPS IP and leave footprints..?
  • ron SERLists.com

    I never use SER to check indexing because it would ban my private proxies. If you do it, use public proxies only. I use Scrapebox with public proxies myself because I track indexing outside of SER. 

  • AlexR Cape Town
    @ron - another gem of a tip. Use public proxies for checking indexing. Well said. 
  • AlexR Cape Town
    @sven - I know there are many choices as to when to use proxies, but using public proxies for index checking would be quite neat. 
  • edited August 2013
  • Sorry I had missed your reply, davbel - I only just saw it... I appreciate the feedback, thank you.
  • @ron - I was starting to do the same for checking if my imported targets were indexed, but then I started to check whether, when SB said they were not indexed, that was actually the case. Manually checking the targets in Google showed that many were actually indexed - I tried 20 and only 1 was actually not indexed. Any ideas why? Do you do a custom check, or is it literally using SB's index check?
  • davbel UK
    edited September 2013
    @ron don't you get a lot of false negatives using public proxies for index checking?

    Sorry, I just re-read that. I meant, as @jjumpm2 mentioned: do you get a lot of sites reported as not indexed when they actually are?