SER Made Contextual Indexing Rates Since 1st July 16

shaun https://www.youtube.com/ShaunMarrs
edited July 2016 in GSA Search Engine Ranker
Afternoon guys,

Has anyone else noticed a change in their indexing rates over the past few weeks? I'm not 100% sure of the date it started, but things seemed fine at the end of last month. I have gone from getting around 75% of SER made contextual links indexed and maintaining that number, to getting around 25% indexed, which then drops off over time to 5% at best.

I have been testing a crap ton of indexing services to try to find a solution, but none of them are working.

I'm not sure if Google have released another indexing algo patch as they did around this time last year, but it seems both phases of indexing links are now much stricter: the initial phase where Google does its checks and decides whether the link gets indexed at all, and then the constant ongoing check on links already in the index, which seems to be kicking new SER contextuals out of the index like there's no tomorrow :(.

The strange thing is, existing pages on the exact same domains are indexed and holding steady, but new pages on those same domains show the effect I'm describing.

My initial thought is that Google have updated their index algo and can now kick spammed sites out of their index more easily. As I said, I have been testing a fair few indexing services, along with auto spun, human spun and manually spun content. I plan to use ReCaptcha solvers once Sven has investigated a few things I have reported with CB, to try to get a set of less spammed domains to run my systems on, but other than that I'm at a loss :(.

Is anyone out there tracking the indexing rates of contextuals made by SER this month and able to share them, along with some light on the methods they use to get links indexed?

Comments

  • shaun https://www.youtube.com/ShaunMarrs
    I was hoping to get some input on this; I've had a few PMs and it seems it is widespread.

    I have tested my old indexing method on some web 2.0s and they are holding steady right now, leading me to believe it is something at the domain level that Google have changed.
  • redrays Las Vegas
    @shaun - appreciate you digging into this and sharing it here. I took a look at my own data and those of a few friends, and I can also confirm that indexing rates have taken a hit. None of us were getting the 75% success rate you were (not surprising, you know your shit), but all of us have seen some decrease.
  • shaun https://www.youtube.com/ShaunMarrs
    edited July 2016
    @redrays I have had a little free time today, so I've been asking around on Skype with people I know who are heavy SER users, and they are all reporting the same thing :(.

    One guy has his own in-house uBot system, pretty sure it's something along the lines of Authority Indexer Pro, and he is getting a 30% initial index rate, then it's falling too.

    Pretty worried tbh. I know there have been people saying SER is useless or dead for years, but there have always been things to tweak. My personal rule for T1/T2 is that the links have to be contextual, indexed and dofollow. I don't care how I get them, but if a link is not indexed then it's useless.

    The time may be near to move over to PBNs and expired web 2.0s :(. I have a plan for SEREngines as T1/T2 and SER non-contextuals for T3, but I'm not sure how having 100% of links from web 2.0s will work, never tried it.
  • Thanks for the info, shaun. Keep us up to date. No sense spending money and resources on SER if links aren't getting indexed. Total bummer.
  • redrays Las Vegas
    @shaun - thanks for the additional info. I do think leaning heavily (but not 100%) on Web 2 will work. Honestly it's what I've been wanting to do for a while, just haven't found a decent piece of software to make it reality. Right now SER's main job is boosting linking root domains, and I think there are other ways to get that :)
  • If you have a bit of money, it's still a good idea to just buy a bunch of great PBNs. Not great to start with tho if you have no moneys.
  • shaun https://www.youtube.com/ShaunMarrs
    Busy day testing. I have decided all but two of the indexing services I am testing are useless, so I've stopped wasting time on them. The two that seem to work are both heading the same way as all my other tests, with rapidly falling indexing rates, but I will keep checking them.

    Web 2.0s are holding strong; a fair few test batches are doing the opposite of the SER-based platforms, with their indexing rates increasing.

    Did some "site:domain" checks on some of the random Drupals in my list. I will track the number of pages Google has indexed on each domain, and when it increases I plan to reverse engineer the new pages. Spoke to Sven today and he has found the bug in CB that was making it massively cost inefficient to try to get better domains for SER to post to on its existing platforms. Hoping to have it fixed in the next few days, then I will throw some cash into building a list of verified domains that are protected by ReCaptcha, kCaptcha and Mollom for testing.
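
    Nothing fancy on the tracking side, just diffing snapshots of the site: results. Something like this rough Python sketch is all it takes (the file names are placeholders; each file is just one indexed URL per line, dumped on different days):

        # find newly indexed pages between two "site:domain" snapshots
        # (placeholder file names, one indexed URL per line)
        def load(path):
            with open(path) as f:
                return {line.strip() for line in f if line.strip()}

        old = load("site_results_day1.txt")
        new = load("site_results_day2.txt")

        for url in sorted(new - old):
            print("newly indexed:", url)
        print(len(new) - len(old), "net change in indexed page count")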

    Not really any further forward, but trying to develop a direction to go in. If the web 2.0 thing holds then it's good news for SEREngines V2. I have been using RX to build my test web 2.0 batches; it has improved a fair bit since I stopped using it, but it still lacks basic features, and that's where SEREngines will take the advantage.

    Anyway I will check in over the weekend with any testing I do guys :).
  • redrays Las Vegas
    Cheers @shaun, good stuff as always :)

    I hadn't thought of "building a list of verified domains that are protected by ReCaptcha, kCaptcha and Mollom", but it makes sense based on what I think is probably causing these indexing problems - restrictions on how many pages a domain can have indexed based on some quality metric, and/or restrictions that kick in if a bunch of new pages start appearing all at once.
  • Make sure to also check the meta robots tag of the pages where SER submitted an article. I've found before that some of the sites had new pages set to "noindex".
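
    A quick way to automate that check, if it helps: a minimal Python sketch, assuming requests and BeautifulSoup are installed (the URL list file name is just a placeholder):

        # flag verified URLs whose pages carry a robots "noindex" directive,
        # either in a meta tag or an X-Robots-Tag header - those pages can
        # never index no matter how hard you push them
        import requests
        from bs4 import BeautifulSoup

        def is_noindex(url):
            resp = requests.get(url, timeout=15)
            if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
                return True
            soup = BeautifulSoup(resp.text, "html.parser")
            for tag in soup.find_all("meta"):
                name = (tag.get("name") or "").lower()
                content = (tag.get("content") or "").lower()
                if name in ("robots", "googlebot") and "noindex" in content:
                    return True
            return False

        with open("verified_urls.txt") as f:  # placeholder file name
            for url in (line.strip() for line in f if line.strip()):
                try:
                    status = "NOINDEX" if is_noindex(url) else "ok"
                except requests.RequestException as exc:
                    status = "error (" + str(exc) + ")"
                print(status, url)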
  • shaun https://www.youtube.com/ShaunMarrs
    @redrays Yeah, I was thinking something similar. It wouldn't be too hard for Google to also add something for traffic per page or the like. For example, BuzzFeed getting, say, 10,000 visits per new page whereas some random SER domain gets nothing per new page. Protects the true authority sites and ditches the burners.

    @anonymous That's a fair point actually, mate, never thought of checking stuff like that. One thing, though: they index and then drop off over time, so the noindex tag would have to be added by the webmaster after a few hours or something.
  • UPTOWN
    I am not a native English speaker, so sorry if I misunderstood. As I understand it, you are building backlinks to your money site with SER? I can say web 2.0s are getting lower index rates, but it still works. What I do is create web 2.0s with nukeTNG and boost the tier links with SER. Also, crowd search really helps.
  • shaun https://www.youtube.com/ShaunMarrs
    @UPTOWN Yeah, I have started a fair few web 2.0 tests now and they seem to be doing well. Already working on a new web 2.0 based method.



    So again another day with a fair bit of testing.

    It seems like the pages with 100% auto spun content from Kontent Machine, made with SER, are maintaining a higher index rate than pages made with human sentence-level spun content like The Leading Articles, and even my own manually word-spun content.

    Thinking there might be something about the uniqueness of the content, even if it is not readable. I have run a few of the pages through Grammarly and it is picking up loads of errors in the content. I always thought that if Grammarly can find the errors then Google could too, but it seems they are still indexing it.

    I can remember reading about a content snippet patent Google had filed, and I'm wondering if that is what they have added to their algo. The maths side of what was publicly available made no sense to me, but a few guys were explaining it in the thread over on BHW. From what I understood, it was based around comparing content at a much smaller level, such as sentence vs sentence or even something like 5 words vs 5 words, to detect duplicate content, then working out an overall duplicate percentage for that page against the other stuff Googlebot had processed. So if your percentage was small enough to account for things like legitimate quotes, the page would still get indexed, but anything over a threshold would be kicked out.
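
    Just to pin down what I mean (this is only my reading of it, not Google's actual algo), here's a quick Python sketch that chops two pages into overlapping 5-word chunks and works out what percentage of one page's chunks already exist in the other (the file names are placeholders):

        # duplicate percentage via 5-word shingles - illustrative sketch only
        import re

        def shingles(text, size=5):
            words = re.findall(r"[a-z0-9']+", text.lower())
            return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

        def duplicate_percentage(page_text, seen_text, size=5):
            page = shingles(page_text, size)
            seen = shingles(seen_text, size)
            if not page:
                return 0.0
            return 100.0 * len(page & seen) / len(page)

        # anything over some threshold (say 30%) gets treated as duplicate
        # and kicked out; small overlaps like legitimate quotes survive
        print(duplicate_percentage(open("new_page.txt").read(),
                                   open("already_indexed.txt").read()))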

    That being said, all of the web 2.0 pages seem to be holding well with all three types of content, so it's frustrating me a little.

    Also, if it is a content thing then it's going to be a nightmare for my plan of getting targets protected by ReCaptcha and such, as I'm presuming the content will need to be human readable due to possible human moderation on the domain. I have not actually tried this yet, though, so we will see how it goes when the CB patch is released. I was planning to use ultra spins from The Leading Articles, but they are only spun to sentence level while still being human readable and cost effective.


  • redrays Las Vegas
    @shaun - I do large scale tests with heavily duplicated content. Although it's not SER related, I feel that the lessons are somewhat applicable here. Anyway, sometime in the early spring my indexing rates dropped to like 10% of what they had been for the 4-5 previous months. The old content was 90-95% duplicate and still indexed just fine. I fixed the problem by knocking the % of heavily duplicated content down to 50-60% with the remaining 40-50% being heavily spun stuff from KM.

    As for creating human readable content, I bet you could get away with doing 1000 word articles where about 200-400 words are heavily spun and the remaining 600-800 are duplicate content. Stick 100-200 words at the beginning and end, the duplicate stuff in the middle, and I think you'll be good.
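
    Roughly what I have in mind, as a throwaway Python sketch (the word counts match the split above; where the spun and duplicate text actually come from is up to you):

        # assemble a ~1000 word article: spun opener, duplicate middle, spun closer
        import random

        def grab_words(text, n):
            words = text.split()
            start = random.randint(0, max(len(words) - n, 0))
            return " ".join(words[start:start + n])

        def build_article(spun_sources, duplicate_sources):
            opener = grab_words(random.choice(spun_sources), random.randint(100, 200))
            middle = grab_words(random.choice(duplicate_sources), random.randint(600, 800))
            closer = grab_words(random.choice(spun_sources), random.randint(100, 200))
            return "\n\n".join([opener, middle, closer])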
  • shaun https://www.youtube.com/ShaunMarrs
    @redrays That's not a bad idea actually, I will try to knock something up over the weekend for testing.
  • shaun https://www.youtube.com/ShaunMarrs
    So, more testing. It seems @Grax1 has the closest thing to a working indexing service, tracking 40% after a few days; there have been a few dips but it seems steady now. Hope to God it holds!

    My web 2.0 tests went a little mental last night and their rate dropped too, but it seems to have recovered. Auto spun content is definitely indexing better than human or manually spun. I have chased Sven up as the bug fix wasn't pushed and I wanted to get building my new verified list this weekend.

    So I was talking to a friend on Skype, who used to use SER before he moved over to uBot and his own custom CMS posters, about my plan to build a better verified list of targets protected by ReCaptcha. He revealed a little about what he is doing and said he doesn't touch contextual domains that have weak captcha protection, and his indexing rates have only dropped a little bit. He uses high quality manually spun content too, with crazy high uniqueness, so it seems the way to go and I'm looking forward to giving it a try :).
  • Can you name the indexing services you use?

    I've had a few PBNs get deindexed, really annoyed, but shit happens. 

    I've got a few new sites that just aren't getting indexed (or the homepage has but no inner pages), and I've added new pages to old sites that aren't getting indexed either (I added the pages probably about 7 weeks ago).

    Need to figure this out. Good old sape still works though ;)
  • shaun https://www.youtube.com/ShaunMarrs
    edited July 2016

    Tried all of these looking for a solution but dropped them all as they were totally useless. Some were through friends' accounts, getting them to submit the link batches for me, but they have dropped the services now too.

    http://www.lindexed.com/

    Whatever @grax1 did is holding around 40% indexed. 

    I spoke to Sven today, and the new CB update that was pushed today has fixed the two bugs in CB that I needed fixed to start building my new verified list. But I can't get one of the features to work to track the success rate of the service, as, much like the indexing services, many of them are just scammy cash grabs. So I need to talk to Sven a little more to get it to work, or I'm going to drop the idea of the higher quality verified list test.

    I have been thinking over the weekend and I'm starting to lean towards 100% PBN/expired web 2.0 stuff and dropping SER completely.
  • @shaun Omg man, what are you talking about, dropping SER? That's crazy talk... don't make me lose my shit over this. Is it really that bad?
  • @shaun thanks mate, I use ExpressIndexer.

    I haven't used SER since early May, I think it was. I went headfirst into PBNs. I've had a couple deindexed but had some amazing results too. I am going to use SER to power them up, as I find that getting reinvented aged/dropped domains indexed with indexers is a slow process nowadays. I don't mind really, as it's almost like the links from my PBNs are drip-fed to my money sites.

    I loved SER, but I hate using servers and all of the other add-ons that it needs to make it work super-fast, especially when I wasn't seeing positive results towards the end. I have a new strategy that I want to try out, but if that doesn't work then SER might just be one of those tools I keep on my HDD for future use. It's a shame, as I have quite a few licenses. I CBA to hire someone to crack platforms and make bespoke engines for me, either. That's just not something I want to spend time on, but I'm sure it's the way forward to keep away from the masses that use the same sites.



  • Do you scrape your own targets or are you using a list? You're probably scraping, but just wondering. I can say this: I started scraping and have had better results. I think it's really easy for Google to filter spammed-to-death properties now.
  • shaun https://www.youtube.com/ShaunMarrs
    @Vlad The point of a link is to power up whatever it is pointing at. If Google doesn't even see a reason to have a link indexed, then what kind of benefit can that link offer? My personal testing has shown time and time again that nofollow or unindexed pages offer no benefit.

    I have one server paid for until the end of next month. I think I will work on it till then, and if I can't work out a way to get the contextuals indexed then I'll drop it. Grax1 is planning to launch his service soon, so I will see what kind of volume it can deal with. I also have a web 2.0 based method that uses SER to some extent, so if that can get pages to the top three then I will keep it too.

    @judderman I used to love Express Indexer, then I dropped it for my own method. My own method stopped working, so I went back to it, and it no longer works either; no idea how long it has been unserviceable for.

    I have nothing on this week, so I am putting most of my time into PBNs and a variation of the PBN method, along with expired web 2.0s, as everyone I speak to who uses them seems to be killing it. SER stuff is taking a back seat as I don't want to start any new projects without a working method for it. Looking forward to SEREngines, but if I have dropped SER by the time it's released and have moved to PBNs, then they will take all my time.
  • shaun https://www.youtube.com/ShaunMarrs
    @Seo_Gladiator I do both: I scrape my own contextual targets but just use a list for non-contextuals now, as I feel it is more efficient. My scraping rig is finding expired web 2.0s right now, though, not SER stuff, but I still have a list of just over 1,000 contextuals with around a 200-500 stick rate on them.
  • shaun I use elitelinkindexer, have you tried it? Pretty awesome for me. 50% or higher indexing rate in 24 hours for me.
  • shaun here is the elitelinkindexer.com thread on BlackHatWorld


    Not sure if links fall back off; I will test to find out. I was blown away that 60% of my contextuals were indexed in less than 24 hours. Scraped contextuals, I might add, not from a list.
  • shaun https://www.youtube.com/ShaunMarrs
    @Seo_Gladiator I haven't actually, mate. When I get home later I will put $10 in to test it though. Got my fingers crossed :) thanks for the heads up.
  • shaun It's the best I have seen, and I actually tried authoritylinkindexer too. It's not bad, it's just that the cost and buying accounts are a pain in the ass. So for me, elitelinkindexer is where it's at. You're welcome. God knows you have helped me enough.
  • 710fla ★ #1 GSA SER VERIFIED LIST serpgrow.com
    I've been keeping up with this thread and have a question.

    So building tier 2 links no longer keeps the tier 1 contextuals indexed?

    You're saying they initially get indexed and Google later de indexes those pages?

    I'm thinking of posting to all my tier 1 contextual target links and checking their index status after a week.

    Then only keeping domains still indexed and deleting the rest until I build up a decent list.
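
    Something like this quick Python sketch for the pruning step (the CSV name and columns are placeholders for whatever the index checker exports):

        # keep only domains where at least one test post is still indexed after a week
        import csv
        from urllib.parse import urlparse

        keep, drop = set(), set()
        with open("index_checks.csv") as f:  # placeholder export with columns: url,indexed
            for row in csv.DictReader(f):
                domain = urlparse(row["url"]).netloc
                (keep if row["indexed"] == "1" else drop).add(domain)

        drop -= keep  # a domain survives if any of its test pages stuck
        with open("kept_domains.txt", "w") as out:
            out.write("\n".join(sorted(keep)))
        print(len(keep), "domains kept,", len(drop), "dropped")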
  • shaun https://www.youtube.com/ShaunMarrs
    @Seo_Gladiator  just signed up :)

    @710fla They just deindex. It's as if they aren't meeting some kind of quality score or something, so they get kicked out of the index.

    I have just checked the test batch from grax1 and it's actually up to 60% indexed. I can't guarantee he isn't just reprocessing the links each day to keep the index count up, but it seems to be working and gives me hope :). Maybe some indexing methods counter the patch somehow. Looking forward to the results of Elite Link Indexer over the next few days too.
  • redrays Las Vegas
    I wouldn't say that tier 2 links no longer keep tier 1 contextuals indexed. I think it's still better than nothing, just less effective than it was even a month or so ago.

    @710fla - I like this idea. Might be a good idea to make multiple posts per domain as well.

    Interesting stuff @shaun and @JudderMan. Appreciate the sharing :)
  • shaun https://www.youtube.com/ShaunMarrs
    @Seo_Gladiator Testing Elite Link Indexer: I put four batches of URLs into it and only one has been processed; the other three have been sitting in the queue for like 12 hours... is this normal?

    That being said, the processed batch has an initial index rate of 60%. Looking forward to seeing how it tracks.

    Also, santos and I were talking about a little feature he thought of that shouldn't be too hard to add to SER but might help counter this indexing problem.

    Also, I have now started building my premium verified list of domains protected by ReCaptcha and stuff for testing.