
Fast Niches - 'Churn and Burn' Techniques?

Hi,

I am currently working with certain 'niches' which will be very profitable for a few weeks and then drop to almost worthless after a couple of months. What I am after is a way to exist on the first page of Google for a week to cash in, then move on to the next one. My research has led me to 'churn and burn' SEO; however, the information I have found is fairly old.

A strategy I know of involves buying an old domain with a PR above 0, 301 redirecting it to the money site, spamming the hell out of it with backlinks, and sending 'likes'; then it should obtain first-page glory.
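(For context on the redirect step: the sketch below is a minimal Python check, not anything from this thread, that confirms an old domain actually answers with a 301 pointing at the money site before any links are blasted at it. The domain names are placeholders.)

    import http.client

    def check_301(old_domain, expected_target, path="/"):
        # http.client does not follow redirects, so the raw response can be inspected
        conn = http.client.HTTPConnection(old_domain, timeout=10)
        conn.request("GET", path)
        resp = conn.getresponse()
        location = resp.getheader("Location", "")
        conn.close()
        return resp.status == 301 and expected_target in location

    # "old-domain.example" / "money-site.example" are placeholder names
    if check_301("old-domain.example", "money-site.example"):
        print("301 in place")
    else:
        print("redirect missing or not a 301")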

My question is, does this kind of ranking still work? Is there a better way of doing things to get to the top for a short amount of time?
I aim to have several pages for several different keywords at the top at the same time.

Any recommendations or tips would be greatly appreciated,

Many thanks,
toasterman

Comments

  • Yes, that kind of ranking still works. Someone posted their journey on BHW of just spamming their site with backlinks. If it's okay to link to other forums, I will post a link to their thread.

    He did something similar to what you're proposing. At first he spammed to his money site, which I believe was an old domain. Then eventually he did a 301 redirect to a Blogspot blog and saw bigger gains (Blogspot blogs and some others have a reputation of being spam-friendly--just pick one that allows you to monetize it).

    I'm going to try the method. A 10+ year-old domain recently became available to me after I won it in a GoDaddy auction. Someone else mentioned it helps to have niche-related content on the domain before re-directing it. Some let it settle for a couple weeks with the niche-related content before the 301 redirect to your money site, but some don't. I even read one account of a 301 redirect being done without hosting the domain--just did it from the GoDaddy domain manager.

    So you can do the 301 method, but you should also try it without the 301 redirect by solely blasting links at the spam-friendly web 2.0 property.

  • This is the link I assume Squidward was talking about: http://www.blackhatworld.com/blackhat-seo/my-journey-discussions/625283-mass-spam-dominate-10k-day-niche-hopefully.html

    If I were the guy in the BHW thread, I'd be keeping up the blasting on the 301'd domain while it was ranking. He's swapped to the Blogspot only now, but with the 301 in place it still makes sense to bombard that domain too.


  • spunko2010 (Isle of Man)
    This is very interesting, thanks for sharing @JudderMan, I hadn't seen it before ^
  • @JudderMan Interesting indeed. Much appreciated.
  • Yes, @JudderMan posted the thread I was referring to. It looks like from his recent updates he has started earning commission.
  • It's a f**king awesome thread I think. Shows just how powerful SER can be and shows I still have a lot to learn with spamming. It's mad that Blogspot can take all those links and actually work....
  • i just read the whole thread, interesting stuff... i will try something like that.
  • Was definitely a good read. Read the whole thing as well. I think I found another niche that he's doing this in, because there is a Blogspot page that is shooting up in the rankings pretty quickly. And the niche is exactly what he was talking about. Brand new, and the searches are blowing up on it as well.
  • spunko2010 (Isle of Man)
    edited January 2014
    ^ I found his niche / website far too easily; people are careless with their WHOIS data. But I won't out it, will just keep watch on how it unfolds. Keep your WHOIS private, people ;)
  • He's on this forum too....just saying :)
  • Not trying to start anything here @spunko2010. You say you don't want to out it... but you just did. Please edit your comment out of respect for that user.
  • spunko2010 (Isle of Man)
    edited January 2014
    Hi @jpvr90, how did I 'out' it? I haven't mentioned anything. :S I just meant that, as has happened on other SEO forums in the past, there are people who look for information and then report the URL to Google, use it for their own SEO, negative SEO, etc. Just saying it's important to be careful; s/he is trying to help others but it can backfire - yes, I speak from bitter experience!
  • edited January 2014
    @spunko2010 you're not telling them exactly where the treasure is... but you sure did give them a map to find it. Do you get it now?

    I don't think the user would like it very much... but it's your call.
  • spunko2010 (Isle of Man)
    edited January 2014
    Well, I disagree, but I have asked Sven to remove my original comment anyway. I don't want to start beef. And again (if they read here) thanks to this user for sharing their methods and results so freely, always helpful to learn a few new things.
  • gooner (SERLists.com)
    Nice thread over there. From what the guy openly says in the thread it narrows down the niche to about 3 possibles, no detective work required. So I don't think @spunko2010 has outed him.
  • Agreed @spunko2010 I'm glad he shared.

    I want to try his method, or similar to it, but at the pace I'm harvesting urls I don't think I'll be able to anytime soon. I set the connections in Scrapebox to 5 (10% of my 50 private proxies), but I'm averaging 20 urls/second.

    At the moment I'm scraping the first 1,000 keywords of a 100,000-keyword list with the footprints for blog comments. I can't imagine how long this will take to finish with the rest of the keyword list, plus additional lists, and then scraping with the footprints of other targets.

    Can anyone offer any solutions to increase the harvesting rate?
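(As background on why the harvest takes so long: each blog-comment footprint is paired with each keyword, so the query count multiplies quickly. The sketch below, with placeholder footprints and keywords rather than anything from this thread, shows the usual way those combinations are generated before being fed to a harvester like Scrapebox.)

    from itertools import product

    # Placeholder footprints/keywords -- not the ones used in the thread.
    footprints = ['"powered by wordpress" "leave a comment"',
                  'inurl:blog "post a comment"']
    keywords = ["keyword one", "keyword two", "keyword three"]

    # Each footprint is paired with each keyword to form one search query.
    queries = ['{} "{}"'.format(fp, kw) for fp, kw in product(footprints, keywords)]

    # 1,000 keywords x a few dozen footprints is already tens of thousands of
    # queries, which is why the urls/second harvesting rate matters so much.
    print(len(queries), "queries")
    for q in queries[:5]:
        print(q)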
  • gooner (SERLists.com)
    @squidward, try the scraping video series for sale here on the forum. It'll show you some better techniques than using standard footprints.

    That will help you scrape faster, with fewer proxy bans because no footprint searches are used, and it will increase your success rate when posting to the scraped list.
  • @Squidward You can use GSA by itself with Search enabled for whatever language you are targeting, load that keyword list into your project, and post blog comments VERY fast. No scraping necessary. Now, for Tier One level targets, such as Drupal Articles and so on, you must scrape unless you are @ron and have God Like GSA abilities. 
  • I don't think anyone is trying to out his niche. Just the problem when you are so public with that stuff and others that know what they are doing.  It can be pretty easy to figure it out.
  • Thank you @gooner. Are you referring to the one by @donaldbeck? If so, I'll make the purchase now.

    @caneh34d I think I'm going to do both: give GSA SER a large list of target urls to post to, and when that runs out I'll let it scrape. Or perhaps I'll continuously feed it a list. I read that it's best to let GSA SER use its resources for something other than scraping.

    Doesn't letting GSA SER scrape burn out your proxies sooner?
  • gooner (SERLists.com)
    @squidward - Yes that's the one mate.
  • Thanks @gooner

    How many urls/second are you able to scrape on average?
  • gooner (SERLists.com)
    @squidward - You're welcome. It depends on threads/proxies of course. I have 75 threads scraping 3 search engines and can get around 150-250 urls/sec total.

    But the impressive part is the verified % of those scrapes; I am getting 5x the total verified links from each scrape since I stopped using footprints only.
  • Nice, thanks again @gooner  I'm looking forward to viewing the series.
  • edited January 2014
    @Squidward Here's the deal with GSA SER Search. On the options tab you must set Search to Public. GSA has a built-in public proxy finder. You use your private proxies for submission and verification and your public proxies for Search.

    All that being said, in a nutshell, it is better to scrape IMHO. I also have the @donaldbeck video and it is VERY good. I use GScraper and can get between 1,000-2,500 urls/sec. Scraping is an art and I am in no way good at it yet. The video series is a big shot in the arm for anyone needing to get up to speed fast.


  • gooner (SERLists.com)
    One thing to consider is that although GScraper is much faster, on top of having to pay for its proxy service there are 3 drawbacks:

    1) It only scrapes Google - Which means potentially less possible targets to find.
    2) It scrapes a ton of duplicates; for example, a recent 24-hour scrape produced 32 million urls, and with dups removed it was just 2 million (a simple dedup pass like the sketch after this post handles that). SB can also scrape 2 million uniques in 24 hours easily.
    3) It uses public proxies, which produce poor results with any kind of footprint search.

    I tested both and decided to go with SB because I can use the same private proxies I use for posting, so I save myself $66/month. Even with 1 VPS dedicated purely to posting to scraped links 24 hours per day, I still find more targets than I can process.
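(A minimal sketch of the dedup pass mentioned in point 2 above: collapse a raw harvest file down to unique URLs before posting. The file names are placeholders, not anything from the thread.)

    seen = set()
    kept = 0
    with open("harvested_urls.txt", encoding="utf-8", errors="ignore") as fh, \
         open("unique_urls.txt", "w", encoding="utf-8") as out:
        for line in fh:
            url = line.strip()
            if url and url not in seen:   # exact-match dedup; lowercase or strip
                seen.add(url)             # to domain for a coarser pass
                out.write(url + "\n")
                kept += 1
    print(kept, "unique urls kept")

Holding tens of millions of URLs in one set eats RAM, so very large harvests are usually split and deduped in chunks, or deduped with the options built into the scrapers themselves.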
  • edited January 2014
    @gooner I own both GScraper and Scrapebox. All I have been testing so far is GScraper. I had no idea that Scrapebox could use private proxies like those I dedicate to GSA without getting them banned quickly. I will give it a go. A little update though. I have been testing GSA Scrape and Post functionality for the last few hours. Right now I have five projects running to see if I can gain any benefit / results from this:
    • ARABIC
    • ENGLISH
    • GERMAN
    • RUSSIAN
    • SPANISH
    Each uses a large keyword list matching the language and all the GSA footprints. No PR and no OBL filter. Only badwords... 

    In four hours I have decent numbers using no global site lists and no imported URLs. The breakdown is as follows:
    • 107 URL shortener
    • 99 Article
    • 70 Indexer
    • 69 Blog comment
    • 43 Wiki
    • 31 SEREngines Web 2.0 Profile
    • 25 Exploit
    • 14 Pingback
    • 9 Image comment
    • 8 Trackback
    I think if I can get the keywords figured out, this would be viable to run on autopilot, especially in the lower tiers. At a minimum, I am using it to build up my global site list on occasion.
  • gooner (SERLists.com)
    edited January 2014
    @coneh34d - I run a few projects scraping with SER too, just to help grow my verified even more. Works well.

    The reason I stopped doing all scraping with SER is that it is much harsher on your proxies. You can run SB with just 50 threads and get millions of links, but SER with 50 threads would be doing really small numbers.

    This could be improved if SER could scrape a global target list and if you could set scraping/posting thread totals independently.

    But in all honesty, none of this stuff really becomes an issue until you have 100s of projects. I'm running into a lot of issues scaling up - as you know from the other thread! lol

    For the keywords, you know you can put them all in a folder and have SER choose one randomly?
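(A rough sketch of the folder tip above: split one master keyword list into smaller files in a folder so SER can pick one at random. The paths and chunk size are placeholders, not anything from the thread.)

    import os

    SOURCE = "keywords_master.txt"   # hypothetical master list, one keyword per line
    OUT_DIR = "keyword_chunks"       # point SER's keyword file/folder option here
    CHUNK_SIZE = 1000

    os.makedirs(OUT_DIR, exist_ok=True)
    with open(SOURCE, encoding="utf-8") as fh:
        keywords = [k.strip() for k in fh if k.strip()]

    files = 0
    for i in range(0, len(keywords), CHUNK_SIZE):
        path = os.path.join(OUT_DIR, "keywords_{:03d}.txt".format(files))
        with open(path, "w", encoding="utf-8") as out:
            out.write("\n".join(keywords[i:i + CHUNK_SIZE]))
        files += 1

    print("wrote", files, "files to", OUT_DIR)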
  • @gooner Nice to hear someone else is growing their verified in the same way, and yes, I am using a %spinfile for each language. It is beginning to work. I noticed that my English project built a good number of contextual links. I'll keep my eye on this as I move forward.

    I let SER scrape the search engines with public proxies. That process is CPU intensive and, I agree, is no long-term solution for feeding GSA with targets. It is nice to have though, and I will keep those projects running to build my lists.

    With regard to scaling, there will not be a quick fix for this. Multiple servers and copies of GSA SER are likely necessary. As far as the management burden of this is concerned, it is time to get creative. I know several people on this forum that use automation across multiple servers to meet demand.