My Tiered & SEO Strategy - am I on the right path?

Comments

  • edited February 2014
    Bravo to @ron =D> great post
  • Cool post @ron, but I can't see how I can be aggressive and not get dropped in the rankings :( I'm running a couple of campaigns, all on blogspot.com blogs, and after I rank pretty well (let's say spot 3-8) I drop to nowhere to be found and eventually come back to around spot 40. I'm using 3 tiers. On T1 I use articles, social bookmarks and social networks. Maybe some tips on how to optimize engine selection for contextual links? I uncheck making any profiles but SER is making some anyway. I'm new, please advise :)
  • @LeadK read more and more and more on here on how to set up your SER. I couldn't rank any Blogspot/Blogger blogs. Deleted the content, slapped it on a domain I wasn't using, and it's ranking like mad... think about who owns Blogger...
  • @JudderMan Yes, but I can see Blogger blogs in the top spot with just ONE anchor and 1k+ backlinks, yet mine gets a penalty or whatever it is, even with lots of anchors set up plus generics/domains and only a couple of links daily (20 ± 5 submissions), same on the tiers below (only with a per-link setup). Where is the logic? :(
  • edited February 2014
    I have been able to make links at great speed now, 200k per day or so, but none are indexing, so they are worthless. I have to plug this hole. Ron, do you simply use indexing services on T2? And if so, what is the average index rate, since we know the junk indexes at a worse rate than contextuals? I know you gave me a link recently but I can't recall which thread it was in.
  • ronron SERLists.com

    I have everything going into the indexer now only because they came down in pricing and have large daily limits. Previously I would spend some insane amount, like $130 to index 10,000 links, so in the old days I only used it for T1 because it was so expensive.

    Now everything goes into the indexer API because it is cheap to do so.

    There are no shortcuts to understand your indexing issues. You have to make the time to analyze these things. Take a fresh set of links on any level, weed out the dead links, run it through an indexing check, and take the non-indexed to use for the test. Stick those in an indexer, and then check indexing each day for about 3 days. Then you will understand your issues.
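
    If you want to script that routine rather than eyeball it, here is a rough Python sketch of the same steps. It is purely illustrative: the file name is made up, and is_indexed() is just a placeholder for whatever index checker you actually use (Scrapebox, a paid checker, etc.).

    ```python
    import time
    import requests

    def is_alive(url: str) -> bool:
        """Weed out dead links: anything that doesn't answer with a 2xx/3xx."""
        try:
            r = requests.head(url, timeout=10, allow_redirects=True)
            return r.status_code < 400
        except requests.RequestException:
            return False

    def is_indexed(url: str) -> bool:
        """Placeholder: wire this up to whatever index checker you use."""
        return False  # stub so the script runs end to end

    def run_indexing_test(urls_file: str, days: int = 3) -> None:
        with open(urls_file) as f:
            urls = [line.strip() for line in f if line.strip()]

        live = [u for u in urls if is_alive(u)]            # weed out dead links
        sample = [u for u in live if not is_indexed(u)]    # keep only the non-indexed
        print(f"{len(sample)} non-indexed links go into the indexer")

        # Push `sample` into your indexing service here (API call, CSV upload, etc.)

        # Re-check once a day for ~3 days and watch whether the rate climbs.
        for day in range(1, days + 1):
            time.sleep(24 * 60 * 60)
            indexed = sum(is_indexed(u) for u in sample)
            total = max(len(sample), 1)
            print(f"day {day}: {indexed}/{len(sample)} indexed ({indexed / total:.0%})")

    if __name__ == "__main__":
        run_indexing_test("fresh_t2_links.txt")
    ```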

  • goonergooner SERLists.com
    Nice posts from the grand master @ron
  • There's so much information here and I appreciate each and every one of you for commenting and discussing your thoughts and ideas.

    Blasting my Web 2.0's is really increasing my backlink count over just a few days - probably an obvious statement that I should've known to do before but I'm happy about this!!

    Thank you all!
  • @ron I'm doing that already :) Whether I'm pleased with the results is another matter :P but I'm playing around with an indexer at the moment and things could be looking up. It's too early to start doing a jig or say anything conclusive, though.
  • ronron SERLists.com

    @PeterParker - We all need to verify things for ourselves. It's a pretty important mindset not to take everything you hear in forums as the truth. I see people running around with false information they picked up from another noob. It happens a lot.

    I can assure you that indexing is one of the 'critical' functions of SEO. It is important to understand that some properties index well while others do not. It is also important to understand that it is getting more difficult to index crap properties. So you need to do every trick in the book to make it happen. And if you have issues on the indexing end, try to 'systematically' tear things apart. I wasn't that thorough in my approach, but it helped me gain an insight into what the hell I was doing. I really didn't have a great plan until I did that. 


  • @ron you are a master, bro :)
    I'm reading your every word. Thank you for the strategies and tips.
  • Fight the propaganda, the culture of fear, by taking action! Take risks, get hit and rise again!
  • AlexRAlexR Cape Town
    @ron - I'm just reviewing your overview here and it looks great. For your "Ring Sites" or "T1" sites, what kind of links are you throwing at these? It seems that even if you throw SER at them they will get penalised and lose their value... a never-ending spiral which I'm trying to avoid.
  • ronron SERLists.com

    @AlexR - You are back! I have found that using SER works great for that. They can take a pounding, and they react well to lesser quality links.

    However, I know plenty of people that will build a buffer around their buffer. I guess you can take that logic to infinity. I just don't view it as a threat for precisely the reason that if something was not working out, you could disconnect the links on those buffers.

    Since the goal isn't to rank the web2's (at least it is not my goal), I only care about the conveyance of link juice from lower tiers through that buffer. It has worked well for me, and through all of the penalties of the last two years, some web2's rank in the first 5 pages of G - with nothing but SER links. 

  • Hey @ron, I have a quick question for you if you don't mind. Do you use "verifications reached in a day" at all? The more I'm reading through these forums, the less I'm seeing people use this. Everyone seems to be going for "submissions reached in a day", but I don't understand why. I currently have 2 tiers pointing to a Blogspot. I use an Indexing service to index the second tier. Should I set both the first and second tier to "submissions in a day" instead of "verifications in a day"? Thanks again.
  • ronron SERLists.com
    edited March 2014

    @Enzo - Yes, only use submissions per day. The reasoning is very simple.

    You are using a multithreaded application that is not really verifying each link as you create it. It checks verification later in the day. You will consistently overshoot your target limit by a mile if you use that setting.

    That's why you need to stick with submitted per day. SER can count! And there is no verification needed. So just estimate (roughly) your verification rate per 10 submitted, know how many actual verified links you want to build, and work out the math. It is that simple. And it works! 
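
    A quick sketch of that math in Python, just to make the arithmetic concrete (the 20% rate and the 50-link target below are illustrative numbers, not recommendations):

    ```python
    def submissions_per_day(target_verified: int, verify_rate: float) -> int:
        """Back into SER's 'submissions per day' setting from a verified-link goal.

        verify_rate is your observed verified/submitted ratio for that project,
        e.g. 0.2 if roughly 2 out of every 10 submissions end up verified.
        """
        return round(target_verified / verify_rate)

    # 20% verify rate, want ~50 verified links a day -> set submissions to 250
    print(submissions_per_day(target_verified=50, verify_rate=0.2))  # 250
    ```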

  • AlexRAlexR Cape Town
    @Ron - Yes, back. Been busy with two or three other big projects. I'm not referring to Web2's but actual private network sites. If you throw SER links at them they seem to tank and then lose their value. What kind of links are you adding to these buffer sites?
  • Thanks @ron, I have one more newb question if you don't mind... how do I actually check my verification rate? Is there some sort of built-in function in SER, or do I need to use Scrapebox?
  • ronron SERLists.com

    @Enzo - You just get a feel for a few projects and then you can say "I submitted 100 today, and my verified increased by 20. So I know verified/submitted = 20%. So in order to get 50 on this project, I need to set submitted per day to 250." And then do that for every project.

    @AlexR - I would probably fire just contextuals, and skip all the other junk. Put a filter of PR1+ and let it fly. Other than that, you may have to go outside SER to hit those things with other types of links.

  • @ron what indexing service do you use?
  • ronron SERLists.com
    As a team we use two services, Incredible Indexer and Instant Link Indexer. 
  • AlexRAlexR Cape Town
    @ron - interesting about a PR1 filter. If I recall correctly, this was a waste before. What's changed?
  • ronron SERLists.com
    Just trying to sort out the crap sites. Yeah, back when we started, Alex, SER was a lot less efficient at finding them. But it is a lot better now. So it can handle it.