That's right, someone should run an A/B test: one project with JUST K2 posts, and a similar project with all other engines (or even just a single comparable contextual engine), and see how the links differ in Ahrefs or something.
Ahrefs' data would be totally irrelevant. If you want to find methods that rank, you can either do split testing yourself and wait months for unpredictable results, or you can scrape a huge list of domains from others in your own verified lists, bulk check them in Semrush, and BOOM, you can reverse-engineer methods within a day.
Although I hate to say it, this way I've found pretty much no one actually ranking solely with SER. All the current success stories are pretty much made up as far as I'm concerned. There's one exception with a guy ranking the entire page 1 in the buy **** followers niche, but he used custom engines. Feel free to prove me wrong though.
@BigGulpsHuhWelp I only use a select few platforms in all honesty.. I'd rather set a campaign up to repost multiple times per account on a platform I know works well, than use platforms I'm a bit sketchy about.
I scrape all my own targets, I have PI which makes verifying my raw lists simple and I only scrape the targets I use which makes things easier too, as there aren't many I use.
I would like to add to this discussion. Around the 1st of October I had 4 sites get a penalty (-60). I did not receive any WMT messages, so it was an algo penalty. I should also add it was only keyword/page specific like Penguin, not site-wide. On those particular sites I used GSA for Tier 1 contextuals, mostly consisting of K2. I also used PR Jacker along with very strict rules and limited daily posts. All link anchors were branded, generic, or naked URL for dilution. I reserved money and LSI keywords for my PBN links. It was also not a Panda penalty, as all sites had hand-written, unique, quality content of 2000+ words. Other sites of mine where I never used GSA/K2 actually increased in rankings around that time.
Sure. For blog comments and guestbook posts etc. it can be fully automated. Just get your verified list, scrape the external links on the page with Xenu or SB, and remove all non-root domains (this pretty much removes all links pointing to lower tiers and leaves the money sites untouched). Then you extract your favorite TLDs, bulk check them in Semrush (you can do this directly with URL Profiler), and see which ones are getting any organic traffic.
For contextuals it's a little more work because all the links are on different pages, but with some manual work you can find hundreds of domains within a couple of hours.
Once you've got domains with significant organic traffic, you check them in your favorite backlink checker and reverse-engineer which engines and/or tier setups they're using to rank.
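The "remove all non-root domains" step above is easy to script if you'd rather not do it in Xenu/SB. Here's a minimal sketch in Python, assuming your scraped external links sit in a plain list of URLs; the idea is that tier links almost always point at inner pages, so keeping only links with an empty path leaves mostly money-site homepages. The function name and input list are my own, not from any tool mentioned in the thread; the Semrush bulk check still happens separately.

```python
from urllib.parse import urlparse

def root_domains_only(urls):
    """Keep only links pointing at a site's root (homepage) and
    return their deduplicated domains, order preserved."""
    seen = set()
    roots = []
    for url in urls:
        parts = urlparse(url)
        if parts.scheme not in ("http", "https") or not parts.netloc:
            continue  # skip mailto:, javascript:, relative links, etc.
        # a "root" link has no path, query string, or fragment
        if parts.path in ("", "/") and not parts.query and not parts.fragment:
            domain = parts.netloc.lower().removeprefix("www.")
            if domain not in seen:
                seen.add(domain)
                roots.append(domain)
    return roots

links = [
    "http://example.com/",
    "https://www.example.com",            # duplicate of the above
    "http://spamtier.net/blog/post-123",  # inner page -> tier link, dropped
    "https://moneysite.org",
    "http://another.com/?ref=x",          # query string -> dropped
]
print(root_domains_only(links))  # ['example.com', 'moneysite.org']
```

From there you'd feed the surviving domains into Semrush (or URL Profiler) for the organic traffic check.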
@mikey08 we know that is not true though, based on the indexation rate of WordPress sites compared to K2 sites. WordPress sites index really well and really quickly. K2 sites index at a rate of like 10% for me.
After reading all this I ran a test with a small verified list on my Tier 2. Here are the results.
I ran the list with DBC and a monthly captcha solving service separately.
With DBC
With monthly captcha solver
So, my question is: if the good platforms need premium captcha solving, then how do you find verified domains from a raw list? Obviously I'm not going to use DBC on a raw list.
@londonseo - I don't think that's what he's saying at all. Out of curiosity, are you able to index K2 links created by SER? Are you able to rank with these links?
@londonseo - that's great to hear. Personally, my experience matches what's discussed in this thread, and so I've stopped using K2 entirely.
Just because Google is indexing a domain doesn't mean that it's indexing new pages / posts on that domain. A few weeks ago I was going through some of my verified links and manually checked a couple of domains using XpressEngine. These domains, which had thousands and thousands of posts created with SER, only had 10-50 pages indexed. I don't know if this is more widespread with K2, in fact I have no idea what the problem is, but the indexing rates are awful.
Broken tiered link building.