
3 tiers and not even a link indexed

I have been running SER for 5 days now. I created Tier 1 and another 2 tiers just to index Tier 1 (I am pointing Tier 1 at my Web 2.0s and PBNs), but when I check the Google index there isn't a single link indexed!
I am using hand-spun content and contextual links only in all tiers. I even created a separate indexing project to try to index links from the 3 tiers, with no success.
Should I wait longer? I have checked the root domains and they are all indexed, and so are pages like the ones I have posted on.

Comments

  • Sven www.GSA-Online.de
    5 days is not really a long time for Google to index things today. Give it some more time.
  • OK Sven, I wanted to be sure. Thanks.
  • shaun https://www.youtube.com/ShaunMarrs
    @georgecesar That's just the way indexing is now mate.

    Make a new test project that builds links to google.com. Let SER rip, building out as many links as you like, and send them all to your indexer or build a T2 to them to try to index them.

    After 24 hours, index check them with Scrapebox, then repeat the index check every 24 hours, noting down your results each time (there's a rough logging sketch at the end of this post). I have been doing similar tests for over a month now and the majority of the time it goes like the example below. For example's sake we will use 100 test URLs to make it easy:

    Day 0 - 0 indexed
    Day 1 - 40-50 indexed
    Day 2 - 40 indexed
    Day 3 - 35-40 indexed
    Day 4 - 30-35 indexed
    Day 5 - 25-30 indexed
    Day 6 - 20-25 indexed
    Day 7 - 15-20 indexed

    Sometimes it holds at around 15% indexed, depending on your list and index service provider, but usually it dropped to around 5% indexed with most services. I stopped my index checking, other than three test batches with Elite Link Indexer, about a week to 10 days ago, as I was sick of seeing the same or very similar results.
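
    Something like this rough Python sketch is all I mean by noting the results down, by the way: dump each day's Scrapebox index check to a couple of plain text files and append the date and percentage to a CSV so you can watch the decay. The file names are just placeholders, adjust to however you export:

        # Rough sketch: append one line per day to a CSV with the index rate.
        # Assumes two plain-text exports: all test URLs, and the URLs Scrapebox
        # reported as indexed. File names are placeholders.
        import csv
        from datetime import date

        def load_urls(path):
            """Read one URL per line, ignoring blank lines."""
            with open(path, encoding="utf-8") as f:
                return {line.strip() for line in f if line.strip()}

        def log_index_rate(all_urls_file, indexed_file, log_file="index_log.csv"):
            all_urls = load_urls(all_urls_file)
            indexed = load_urls(indexed_file) & all_urls
            rate = 100.0 * len(indexed) / len(all_urls) if all_urls else 0.0
            with open(log_file, "a", newline="", encoding="utf-8") as f:
                csv.writer(f).writerow(
                    [date.today().isoformat(), len(all_urls), len(indexed), f"{rate:.1f}"]
                )
            return rate

        if __name__ == "__main__":
            print(f"{log_index_rate('test_urls.txt', 'indexed_urls.txt'):.1f}% indexed")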
  • edited September 2016
    Thanks @shaun, this is a great idea. I just don't want to waste a lot of resources as I am on a budget now.
    I stopped using SER for a year, but I have tried using it again recently as I have lost some rankings and wanted to boost my newly created PBNs and Web 2.0s.

    If the index rates haven't improved, I will go back to using PBNs & Web 2.0s only, without any tiers.

    Note: 
    I have learned a lot from all your comments. Thanks man.
  • @shaun I also forgot to mention that I am scraping my own targets and building lists using the link expansion method (Scrapebox's URL extractor).
    Do you think it is worth continuing to scrape my own targets, or should I just buy lists and maybe use those lists to find new targets?
  • shaun https://www.youtube.com/ShaunMarrs
    @georgecesar if you are trying to save cash, build 25 dofollow blog comment, image comment, guestbook and trackback links per day to each target you need indexed.

    It's the cheapest method around right now and it has a similar, sometimes better, indexing rate than the "indexing services".

    In all honesty, I have no idea about the buy vs. self-scrape question right now. Too much is changing with how you need to use SER, and in my opinion it is still way up in the air.

    My previous thinking was to scrape as many of your own contextuals as possible, but because they don't index anymore I have pretty much dropped that and demoted SER to T3 non-contextuals only, as those pages should already be indexed. I have no dramas buying those, as people will find them anyway because they are so easy to get.

  • @shaun Well, I think I will keep running tiers with dofollow links and check again after a few days. I will stop running the Tier 1 projects for now to see first whether the links get indexed or not.

    Thanks a lot
  • @georgecesar you may give Loopline a try. He has the best reputation right now: http://www.autoapprovemarketplace.com/
  • @antonearn thanks a lot, I didn't know he was selling lists. I will try it.
  • You're welcome
  • antonearn Earth
    edited September 2016
    "if you are trying to save cash build 25 do follow blog, image, guestbook, trackback to each target you need indexing per day." 

    Haha, weren't you recently posting some pics from your SER run with 5,000+ links in the secondary link setups, @shaun? :D
    SEO evolves faster than the speed of light... 
  • shaun https://www.youtube.com/ShaunMarrs
    @antonearn I posted some photos a few months back; the 300 LpM pic was building out a T2 and T3, I think, and the 1000 LpM pic was just testing.
  • Okay, I get it.
  • @georgecesar - why not try pointing decent quality non-contextuals at your web 2.0s?
  • @antonearn - the advice that @shaun gave in that thread was spot on at the time and gave a few months of extra life to SER contextuals. But yeah, everything evolves very quickly. Based on what I've seen, the problem is that it's gotten very difficult to index a lot of new pages on low authority domains over the past 6-7 months. Engines like Joomla K2 were hit first, and now it seems to be everything. I'm running into this problem with SER and with all other tools / strategies that I use.
  • manuboss https://seorankhigher.net/service/
    Build backlinks to your backlinks to pass juice and help them index; don't waste money on indexer services anymore.
  • antonearn Earth
    edited September 2016
    I think it's really strange, but from what I've seen, Kontent Machine does not create any H1, H2 or H3 tags?
    Just the title? So maybe Google is looking at that content quality factor a lot more. I bet you also insert a random authority link into your articles in Kontent Machine to create that trust boost? Why not increase the number of contextual platforms in SER? There should be a lot more platforms out there? @sven
  • The majority of what Scrapebox shows as indexed cannot be verified manually. URLs with special characters are easy to wrongly verify as indexed.
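
    If you want to re-check a batch by hand, the encoding is usually what trips the tools up. A rough Python sketch of the kind of exact-match comparison I mean; fetch_serp_urls is just a placeholder for however you pull the result URLs for a site: query (your own scraper, a SERP API, a Scrapebox export, etc.):

        # Rough sketch of a stricter index check: percent-encode the URL before
        # comparing, and only count it as indexed if the exact URL comes back in
        # the results for a site:domain "url" query. fetch_serp_urls() is a
        # placeholder for however you fetch SERP result URLs.
        from urllib.parse import quote, urlsplit, urlunsplit

        def normalise(url: str) -> str:
            """Lower-case the host, percent-encode path/query, drop fragments."""
            parts = urlsplit(url.strip())
            return urlunsplit((
                parts.scheme.lower(),
                parts.netloc.lower(),
                quote(parts.path, safe="/%"),
                quote(parts.query, safe="=&%"),
                "",
            ))

        def is_really_indexed(url: str, fetch_serp_urls) -> bool:
            query = f'site:{urlsplit(url).netloc} "{url}"'
            results = fetch_serp_urls(query)  # list of result URLs
            return any(normalise(r) == normalise(url) for r in results)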
  • shaun https://www.youtube.com/ShaunMarrs
    edited September 2016
    @antonearn ideally, for a new contextual platform to be added, it first needs to be in widespread use so we have targets; then it should offer contextual article submission rather than contextual profile submission; then it should have an easy-to-solve captcha as its default so people leave that in; then it should auto-approve submitted articles by default; and finally it needs to be indexable.

    You can adjust these criteria slightly to try to increase the verified URLs at the end, but there just aren't many platforms out there like that.

    KM has never added headers and the owner doesn't seem to have any plans to add them. I doubt that is what Google is using to keep SER contextuals out of the index. Personally, I think it is something to do with the number of trusted links pointing to a domain versus the number of random outbound links and the number of new pages it is getting per day.

    I was talking to a good friend on Skype the other day and he reminded me of a conversation we had about two or three years ago. He predicted this exact thing, and if this is how they are doing it then he nailed Google's true plan pretty much the day they announced the disavow feature.

    His theory is that Google had an ulterior motive for the disavow tool. They sell it as a way for webmasters to have Google disregard a link pointing to their site. He thinks what they have actually done is turn all the guys who call themselves white hat into their minions: if a domain has x number of disavows against it, then its pages become harder and harder to index.

    It's not hard to find noobs out there who weren't able to rank their websites, assume they accidentally negative-SEOed their own sites, and then start selling negative SEO services where all they do is blast a target site with SER to help their client hurt a competitor. In reality, in my opinion, proper negative SEO is much harder than that, but the thing is many webmasters see the inbound links coming in and disavow them. I don't use Webmaster Tools, but apparently it will report an inbound link once it is crawled, even if it was not indexed.

    I don't subscribe to the whole "add links to authority sites in your articles to get trust" crap. Google is smart enough to know exactly how easy it is to get bots to randomly add extra links to the articles they build, and I haven't added third-party links for ages now. It wastes link juice and slows everything down. Additionally, all the people who let tools like SER add random third-party links to the articles they create are linking to sites whose webmasters may also be using the disavow tool on those links. Not only does the page you created get slapped, which makes all the links in the lower tiers pointless and wastes the resources spent building them, but it also increases the number of disavow hits the domain gets.

    So yeah, even though we spoke about it just the other day, it never really made much sense to me until I typed it out here, so I guess that could be how they are doing it. I am pretty impressed he predicted this so early, lol.



  • I'm sure the disavow tool is used in some interesting ways, but is it really that complicated? It used to be ridiculously easy to index tens or hundreds of thousands of pages of heavily duplicated content on brand new domains. Not only that, it would happen very fast. Around April of this year it started getting much harder, and since then it gets harder every day. That matches up well with when I started struggling with SER contextuals, and those domains tend to be low / no authority as well.
  • shaun https://www.youtube.com/ShaunMarrs
    No idea mate, but they have done something to boost their AI. The last time they did something like this it was the reCAPTCHA words: their AI couldn't read them, so they built reCAPTCHA so that humans would type the words in for their paper-to-digital book conversion project. So it's not like they haven't tricked people into doing work their bots can't do before.
  • antonearn Earth
    edited September 2016
    A lot of backlink talk. Do your sites deserve to rank? Don't get me wrong, Google has put a lot more emphasis on user experience and authority sites. It's not like before, where you could choose one keyword, smash a site up with 10 indexed pages and rank #1. Look at your competition and do twice what they do. Do they have 1,000 words on the homepage? Do 2,000 on yours. And don't target one keyword. I've found that the top guys have a lot of content, everywhere. Go to SEMrush, see what keywords they rank for on each of their pages, and use all of those keywords on your pages as well. Topic relevance is key. One major factor I've found has worked very well is also updating your pages once a week. Add some synonyms, and don't forget to use a plugin so all of your blog posts link to each other. Internal linking is huge and crucial for transferring juice all over. And here's the golden nugget: start using 2016 and/or 2017 in your backlinks. Try it and you'll notice what happens ;)
  • shaun https://www.youtube.com/ShaunMarrs
    @antonearn hmm, most of that reads like someone who has read Quick Sprout and Moz a few times but has never actually tried going against their advice to see what happens.
  • What do you mean? 
  • shaun https://www.youtube.com/ShaunMarrs
    edited September 2016
    "Do your sites deserves to rank?"

    This thread is not about ranking, it's about indexing. I have some T1s propping money sites up that are 100% spun crap and they are ranking in the top 30 for the money site's target keyword. They don't deserve to rank or even be in the index, but not only are they indexed, they are in the top 30... I know some people who put auto-spun crap on their money sites as placeholder content and only put real content on if the site ranks, to keep costs down. The pages that are in the top 30 were made before I noticed the indexing changes a few months back, but I still use the same strategy with RX and it manages to get pages indexed and keep them in the index.

    "Google has put a lot more emphasis on the user experience/authority sites. Its not like before were you could choose one kw, smash a site up with 10 indexed pages and rank #1. "

    The site I am testing PBNs with has 6 pages: 1 main review article that links to 4 smaller product-specific reviews, plus a privacy policy. It has most of its keywords on the first page after around 2 weeks. It is far from an authority site and has what I would class as a bad user experience, as it is designed to get visitors off my site and over to Amazon ASAP.

    "Look at your competition and to twice what they do. Do they have 1000 word on the homepage? Do 2000 on yours."

    I'm not sure content length has ever been related to indexing, in all honesty. There are plenty of pages out there with very little content that are indexed and ranking for stuff, but to be safe I try to get my backlinks on pages with at least 500 words, usually around the 800-word mark.

    Back to the PBN test site I use as an example: it is going up against massive tech authority sites, two of which have articles over 5,000 words. My article is 1,000 words, if that. In my opinion, on-page is all about having the keyword in the title, header and alt text and keeping the keyword/keyphrase density around 1%, but again I don't see how optimising this can help with indexing.
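
    For what it's worth, by ~1% density I only mean a rough count like the Python sketch below, nothing fancier. Conventions differ (some tools multiply by the number of words in the phrase), so treat the exact number loosely:

        # Rough sketch of keyword/keyphrase density: occurrences of the phrase
        # divided by total word count, as a percentage.
        import re

        def keyword_density(text: str, phrase: str) -> float:
            words = re.findall(r"[a-z0-9']+", text.lower())
            target = re.findall(r"[a-z0-9']+", phrase.lower())
            if not words or not target:
                return 0.0
            hits = sum(
                words[i:i + len(target)] == target
                for i in range(len(words) - len(target) + 1)
            )
            return 100.0 * hits / len(words)

        # e.g. keyword_density(article_text, "best budget headphones") -> aim for ~1.0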

    "And dont target one keyword. Ive found that the top guys has a lot of content, everywhere. Go to semrush, see what kws they rank for, for each of their pages. Use all of those kws on your pages aswell. Topic relevance is key."

    If the page is not indexed then it doesn't matter whether you target 1 or 100 keywords.

    "One major factor Ive found has worked very well slso is update your pages once a week."

    The very first site I ranked with SER, before Penguin and Panda, is still indexed and ranked, and I have never updated its content in at least 3 years. I have stuff pulling traffic pretty much from then till now that has never been updated since it was posted.

    I did have a theory similar to this a few weeks ago, but it is impractical; this thread is about indexing automated backlinks. If I set a server to build nothing but contextuals with SER it can push out about 200,000-300,000 in a day. Yeah, SER could have something added to go back, try to log into them all and change a paragraph or something, but I decided it would be impractical so I dropped it.

    "dont forget to use a plugin so all of your blog posts link to eachother. Internlinking is huge and crucial for transfer juice all over."

    I have never tried siloing a site, but this goes 100% against its logic. The internal linking would just be a mess, with unrelated pages linking to each other, and on top of that it dilutes any link juice the pages do have by passing it to ones that don't.

    "And heres the golden nugget: start using 2016 and/or 2017 in your backlinks. Try and you'll notice what happens"

    Santos told me about this a few weeks or months ago; I tried it and it never helped get SER stuff indexed. I see the logic of it helping a page that is already indexed and ranked get more traffic from people who see it in the SERPs, as it appears more recent even if you literally just changed 2015 to 2016 in the title, but it never helped with indexing in my tests.

    Your post just reads like something Neil Patel or Rand Fishkin would publish: based on theory and Google's guidelines rather than stuff you have tested and proven to work.


  • edited September 2016
    Agree with most of that @shaun, placeholder content is what I do. I slap some shit up and, once it ranks (good on-page is the key), I replace it or add to it.

    I have a few pages on my 11-page finance site that are 2,000+ words. It did seem that anything over 5k words is too much and just doesn't index. 1,500-2,000 is my preference, as you can lay it out almost like a wiki, with jump tags and headers, to explain, fulfil, solve and answer most people's questions about the topic, which in turn naturally makes your page (not site, but page) the authority for those keywords.

    The 2016/17 thing is so, so 2011 it's not even funny. Just have a date stamp, or your sitemap will, and that's all that is needed. For SER links it does make sense, but you simply ramp up your tiers to index, go big and then bigger again. I see stupid results in Google when genuinely looking for stuff, where the most relevant content is from 2011 (2007 was one just tonight) and it completely wasn't what I was looking for. Google isn't that flipping great, really, and their PPC ads are shite too. Play the game, game the player.

    I have a one-page HTML affiliate site that is shit and has hidden content - it's basically just an image that links to an Amazon product, and it has ranked like a bitch since 2012. I've added 10 new pages of 500-word Article Forge shit content and only 7 have been indexed in the last 2 months, but I'm ranking 1, 2 and 9 for the keywords. It cost £20 to get someone to make, has been penalised before (for spam, not thin content) and has 0 TF, but it's ranking and earning. Solve that one? I can't, but I don't care as long as ££ is coming in.

    With ecommerce or large sites, running SER on the sitemaps or pushing the sitemaps through indexers does help. I have seen my sitemap pages ranking for keywords on page 6 before, which again makes no sense.
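
    By "running SER on the sitemaps" I just mean feeding it the URL list pulled out of sitemap.xml. A rough Python sketch of extracting those URLs (handles a plain sitemap or a sitemap index; the example URL is a placeholder):

        # Rough sketch: pull every <loc> URL out of a sitemap (or sitemap index)
        # so the list can be saved and fed to SER / an indexer as plain URLs.
        import urllib.request
        import xml.etree.ElementTree as ET

        NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

        def sitemap_urls(url):
            with urllib.request.urlopen(url, timeout=30) as resp:
                root = ET.fromstring(resp.read())
            locs = [el.text.strip() for el in root.iter(f"{NS}loc") if el.text]
            if root.tag == f"{NS}sitemapindex":  # sitemap index: recurse into children
                urls = []
                for child in locs:
                    urls.extend(sitemap_urls(child))
                return urls
            return locs

        if __name__ == "__main__":
            for u in sitemap_urls("https://example.com/sitemap.xml"):
                print(u)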

    All of my affiliate sites use GA and WMT. I have only seen penalties on test sites, which were on purpose, or on two PBN domains that I had to check because I couldn't get them indexed; they were flagged for scraped content, which was Article Forge not doing its job.

    To get back on topic: indexing. SPEEEEEEEEEEEEED is the most important factor, I think. If you have a bloated WP site with 100 plugins and it takes over 2-3 seconds to load, then you're fucked. If you have an HTML site on super-fast servers behind Cloudflare, for instance, then you're probably going to hit 0.75-second load times. With stupidly big images and videos, you really need to go down the paid traffic or social route to gain longer dwell times and lower bounce rates, which negates the speed factor. One of my clients has a Magento site with a 0.6-second load time, and it outranks big, big sites with ease. Packed full of unique descriptions instead of scraped content for his ecommerce site, a handful of blogs and 3 clicks to buy make it one of the best sites, in my opinion.
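
    If you want a quick sanity check against those numbers, timing the raw HTML response is enough to spot the bloated ones. Rough Python sketch; this measures server response only, not a full browser render, so treat it as a lower bound (the URLs are placeholders):

        # Quick-and-dirty speed check: time how long the raw HTML takes to come
        # back. Ignores images/JS/rendering, so it's only a lower bound on what
        # a visitor actually experiences.
        import time
        import urllib.request

        def response_time(url, timeout=15.0):
            start = time.perf_counter()
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                resp.read()
            return time.perf_counter() - start

        if __name__ == "__main__":
            for page in ("https://example.com/", "https://example.com/blog/"):
                print(page, f"{response_time(page):.2f}s")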

    Speed in every sense, relevant content, no bloat or shit, decent(ish) content and excellent on-page (check them with SEOquake and WooRank) and you should index. If not naturally, then push sitemaps, build some decent big links and press releases for new sites, then go mental. I can't comment on SER indexing rates as I've only just started using it again.

  • Indexed or not, my pages will rank the same as far as SER goes. Just because Google doesn't index a link doesn't mean it's not helping you rank. I've taken thousands of links that were unindexed, gotten them indexed, and tested this several times, and it did nothing for the rankings on my SER projects.

    For quality premium blog T1s or money-site links, I would care more about indexing.
  • @kijix84 I never used to check indexing rates as I had too many projects and too many licences of SER, so I was in a similar mindset to the one you mention: just blast away and keep on blasting, as some/most will index. There will be some degradation of links, so do some tidying up and keep on blasting.

    People worry too much, just set off the fireworks and something will happen :)
  • Tim89 www.expressindexer.solutions
    Indeed, +1 @JudderMan, an action causes a reaction.