
Filtering De-indexed URLs

edited October 2012 in Other / Mixed
Hello,

I'm wondering if GSA can check whether links listed in identified/success/verified have been de-indexed?  I know Scrapebox can do it, but does GSA have something similar?

Comments

  • s4nt0s Houston, Texas
    It can check if your verified URLs are indexed by right-clicking on a project > Show URLs > Verified > Index Check.

    You might be able to create a project, import the URLs into its verified section, and then run the index check. There is no index check built into the global site lists section.
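
    Under the hood, an index check is essentially a site: search for each URL. Here's a rough Python sketch of the idea - just an illustration, not GSA's actual code; the User-Agent, the pause, the "did not match any documents" test, and the example verified list are my own assumptions, and Google will throttle or block this quickly without proxies:

        # Rough sketch of a site: index check - illustration only, not GSA code.
        import time
        import urllib.parse
        import urllib.request

        def is_indexed(url, pause=10):
            query = urllib.parse.quote(f"site:{url}")
            req = urllib.request.Request(
                f"https://www.google.com/search?q={query}",
                headers={"User-Agent": "Mozilla/5.0"},  # assumed UA; Google may still block
            )
            with urllib.request.urlopen(req, timeout=15) as resp:
                html = resp.read().decode("utf-8", errors="ignore")
            time.sleep(pause)  # be gentle; real tools rotate proxies instead
            # No results for site:URL usually means the page is deindexed.
            return "did not match any documents" not in html

        verified = ["http://example.com/some-verified-link"]  # hypothetical list
        indexed = [u for u in verified if is_indexed(u)]
        print(f"{len(indexed)} of {len(verified)} URLs look indexed")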

  • AlexR Cape Town
    Wouldn't that be a neat feature? An index check on the global site lists? Or - before it posts to a site list URL, it runs a check to see if it's indexed?
  • Sven www.GSA-Online.de
    I wouldn't want to add this there as it would slow things down a lot, and it's not really useful in my eyes. Placing a link on a deindexed site might be a waste of time, but I don't think it hurts your ranking!?
  • Thanks s4nt0s, I'll try your advice out.

    I agree that doing index checks while posting during an active job would slow things down a lot, which is why I'm thinking of doing index checks before running a job so I always have a clean success/verified list to work with. Scrapebox has a separate module to do this.

    Sven nailed my assumption about posting on deindexed sites - I was thinking that posting on deindexed sites would affect ranking. I read some posts on BHW about posting on deindexed sites, which is why I asked here to see if there is a way to avoid it using GSA.
  • I second the notion of checking a domain for deindexing BEFORE posting there.

    It is a penalty in throughput, but I want the option of paying that penalty. What negative link effects exist now, or will in the future?

    I side with syuusuke.

  • Surely that would slow things down massively? Also, what would you use to actually check whether the domain was deindexed - proxies? Your own IP? Surely these would get banned eventually anyway due to the number of requests they're making? Not in favour...
  • edited November 2012
    I think there is a difference between index checking and checking the server response.

    As for checking for deindexed sites that could be in the global list before you post: they don't
    hurt you. In fact, they work the same as any other link - this was noticed when the blog networks got deindexed. It's also rare that you would scrape a deindexed site from Google, as it wouldn't be there in the first place. If you're importing a list, then clean it yourself beforehand with Scrapebox.

    Your article could be indexed in Google, but the site owner could then have deleted the article.

    What you need is a server response check that looks at the header status code (404, 301, 302, 501, 503, 200, etc.) and then gets rid of all the bad eggs. This is for cleaning out a list for your 2nd tier, to make sure you're building to a list that actually exists. This is a better feature IMO.. not index checking > getrequests.

    You could add it to Show URLs > Verified, then have an option of removing the bad ones when the check is complete.
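
    To illustrate, here's a quick Python sketch of that kind of header check - nothing built into GSA, just the general idea; status_of, the HEAD request, the timeout, the keep-only-200 rule, and the example URLs are my own choices:

        # Rough sketch of a server response (header) check for cleaning a list.
        import urllib.error
        import urllib.request

        class NoRedirect(urllib.request.HTTPRedirectHandler):
            # Don't follow 301/302, so redirects surface as their own codes.
            def redirect_request(self, req, fp, code, msg, headers, newurl):
                return None

        opener = urllib.request.build_opener(NoRedirect)

        def status_of(url, timeout=10):
            req = urllib.request.Request(
                url, method="HEAD", headers={"User-Agent": "Mozilla/5.0"})
            try:
                with opener.open(req, timeout=timeout) as resp:
                    return resp.status
            except urllib.error.HTTPError as e:
                return e.code    # 301, 302, 404, 501, 503, ...
            except (urllib.error.URLError, OSError):
                return None      # dead host, timeout, DNS failure, etc.

        verified = ["http://example.com/page1", "http://example.com/missing"]  # hypothetical
        clean = [u for u in verified if status_of(u) == 200]
        print(f"{len(clean)} of {len(verified)} URLs still return 200")

    Some servers refuse HEAD requests, so a fallback to GET would make it more robust, but the idea is the same: keep the 200s and drop the rest before building tier 2 links to them.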
  • AlexR Cape Town
    Interesting idea about only running a server response check on the site lists. Some of these lists are really big, and this would be a good way to neaten them up! 