
How are these churn and burn results possible?

I have been running a campaign for over a month now, non-stop links; about 96K links have been built. Not even a blip on Google. None of the keywords ever popped up, ever. How the fuck are other sites ranking with like 10 links while I don't even get into the top 300?

Comments

  • Maybe the backlinks weren't built quickly enough, too few of them got indexed by Google, or you weren't aggressive enough with your anchors. Those are at least the things I've seen make or break a churn and burn site.
  • Tim89 www.expressindexer.solutions
    96k links aren't even close to achieving the churn and burn effect; you need to be building that volume within days, as only a fraction of those links will become naturally indexed.

    Are you using an indexing service and are they getting indexed?
  • spunko2010 Isle of Man
    So many variables, as usual; it's impossible for anyone to tell. What indexers are you using, how competitive does Moz say the keywords are, what engines are you submitting to, etc.? If you are submitting only to blog comments with no indexing, then nobody can be surprised.
  • @tsaimllc some sites just never make it, no matter what you throw at them. I wouldn't give up hope though: one site of mine I'd given up on long ago popped up just last week for 5 of its KWs on page 2 or 3, and I haven't done anything with it (i.e. no links) since Sept last year...
  • edited April 2014
    Learn SEO...... it's not all about how many links you have. Like spunko2010 said, there are many other variables, including where your links are coming from.

    Think relevance......
  • Tim89 www.expressindexer.solutions
    If links are crappy, you need volume.
  • @01971 you might have missed the part about him doing churn and burn. In my opinion it pretty much is about how many links you have. Of course velocity, anchors, and the like are also important factors, but when I churn and burn, my goal is to build as many backlinks as quickly as possible.
  • The quality of backlinks also matters. Dofollow is better than nofollow; contextual articles on decent-PR web 2.0 sites backed up by 2nd-tier juice beat a thousand spammy comments on a WP blog with 5k OBLs; plus varying and absent anchor text and all that stuff.
  • sagarpatil 1LinkList Ninja
    I started a churn and burn campaign in the UK payday loans niche; 2 days later one of the keywords is at #80.  :)>-

    I'm targeting around 60 keywords: 7-8 main keywords, the rest long tail. Make sure you run verified lists.

    Let me know if you have specific questions.
  • @tim89 - don't you use any indexing service, or do you just let Google find those links on its own?

    @sagarpatil - do you use all guestbooks/forums/comments and other non-contextuals, or just those shown in the stats as best performing? I guess most of those non-contextuals get built, but SER cannot verify them, as many of them are on moving pages.
  • @tim89 I agree with you, but I have it set to build 24 hrs a day and I don't understand how some people get like 100K links in a damn day; I have GSA set to run 24 hrs a day and never get that many. I bought that keyword scrape list; I think it was BS and doesn't do anything. I've never bought verified lists, but maybe I need to start. I just let it build whatever it can, no filters. I do use Incredible Indexer but don't think it is working well; I should have at least double that number of backlinks indexed. At least, I am guessing this based off the number the project shows as verified URLs.
  • Churn 'n burn can be hit and miss. Sometimes you get properties that take weeks to pop up, and other times you show up within the first 3 pages within days. As long as you keep building out properties, you're bound to hit a few home runs.
  • edited April 2014
    I would bet that indexing is the issue. Indexing is the main factor in any campaign these days, I'd say, churn and burn or not. The worse your indexing is, the faster you'd have to build links to take up the slack with natural indexing. Or somehow find which domains are the ones that index naturally and hit only them.
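The trade-off described above (worse indexing means you must build faster to compensate) can be put into rough numbers. This is a toy calculation, not something from the thread; the 100k target and the 10%/30% natural index rates are made-up assumptions for illustration.

```python
# Toy model: required build rate as a function of natural index rate.
# The 100k/day indexed target and the 10% vs 30% index rates are
# illustrative assumptions, not figures quoted in the thread.

def required_build_rate(target_indexed_per_day: int, index_rate: float) -> int:
    """Links you must build per day so that, at the given natural
    index rate, the target number actually ends up indexed."""
    return round(target_indexed_per_day / index_rate)

# With poor indexing (10%), hitting 100k indexed links needs 1M built:
print(required_build_rate(100_000, 0.10))  # 1000000
# Triple the index rate and the required volume drops to a third:
print(required_build_rate(100_000, 0.30))  # 333333
```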
  • edited April 2014
    @tsaimllc you have to follow the advice for tweaking SER on this forum. You can easily get a steady 200 lpm. It will take a couple of months of tweaking and learning specifically what does it, though. That will bring you up to a cool 288,000 links per day. But don't get your hopes up; even when I was running at that speed it didn't even help my rankings :( As I said before, if you can't index those links they are worthless, hence why I'm not focusing on lpm now.
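The 288,000 figure checks out: lpm is links per minute, so a steady 200 lpm sustained over a 24-hour day works out as follows (the helper name is mine, for illustration):

```python
# Convert a steady links-per-minute (lpm) rate into links per day.
def links_per_day(lpm: int) -> int:
    return lpm * 60 * 24  # minutes per hour * hours per day

print(links_per_day(200))  # 288000, matching the figure in the post
```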
  • @peterparker please feel free to point me in the right direction to get started. I thought I had tweaked it; I sometimes get about 100-140 lpm, but that is with lots of projects running, not the same sites.
  • Well there you go, close enough; there's no secret setting.
  • Tim89 www.expressindexer.solutions
    edited April 2014
    @RayBan I use my own service: https://forum.gsa-online.de/discussion/6228/the-best-indexing-service-value-for-money/p1

    @tsaimllc You're better off scraping your own lists using either Scrapebox or Gscraper and just letting SER post. Don't select search engines; feed your scraped lists to SER. After a while, you'll have a big enough list to rank for anything. Oh, and yes, you need to get your links indexed.

    If you have crappy links you need to be building 100 links to every tier 1 link. Two tiers should suffice for churn and burn quite easily, so I would aim to build 50-100k tier 1 links and then boost these tier 1s with as many links as you can, building your tier 1 and 2 simultaneously, obviously. Then the 2nd factor is getting these links indexed ASAP.

    Oh, and I'm observing 200-350 lpm (1200-1500 threads), contextual link types only.
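To put the ratio above in concrete terms: at "100 links to every tier 1 link", the tier-2 volume implied by the 50-100k tier 1 target is enormous. A quick sketch; the function name is mine, not anything from SER:

```python
# Tier-2 volume implied by building N tier-2 links per tier-1 link.
def tier2_volume(tier1_links: int, ratio: int = 100) -> int:
    return tier1_links * ratio

# 50k-100k tier 1 links at a 100:1 ratio imply 5M-10M tier 2 links:
print(tier2_volume(50_000))   # 5000000
print(tier2_volume(100_000))  # 10000000
```

Which is exactly why the follow-on point about indexing speed matters: that tier-2 mass is worthless if it never gets crawled.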
  • grax1 Professional SEO, UK | White Label SEO Provider
    @tsaimllc it's normal that you are not able to get so many verified links a day, and it's not just because of a lack of knowledge (that too), but keep in mind that @tim89 runs a serious beast for SER (I've seen your setup somewhere on the forum, quite impressive), running at more than 1000 threads, additionally using scraping software that maybe also works on another computer. The next thing is proxies: when you use 20 semi-private ones, it's hard to expect results similar to those from people who probably use at least 5x more proxies than you. So good results require not only knowledge and testing but also decent tools.

    I personally use a VPS from SolidSEOVPS - the Wizard package (no advertising here, just saying so that you can check it), 50 private proxies, and Incredible Indexer. I don't use any captcha services, only CB. I also don't use any scraping software, just SER scraping on its own with default footprints. I'm running about 40 projects; every project has the verified list checked as well as all English search engines for scraping, and every project gets a 500k keyword list. My lpm is 30-60, depending on the version of SER; this morning it was 38. I run 6 projects 24/7 and the rest go on the scheduler, 10 projects for 20 minutes each. 600 threads, HTML timeout 120, 40s between search queries. OK, I think that's enough info to give you guys an overview.

    My main goal with SER is to be able to build tons of backlinks for the various churn and burn approaches I'd like to try. Unfortunately, as of now it's not enough to do a serious blast; a minimum of 30-40k a day would already be satisfying for me. I'm considering buying either Scrapebox or Gscraper, but I'm not quite sure I will be able to get it all working properly. I mean, there is a learning curve everywhere, but running scraping software requires additional proxies and an additional server, and I've heard from a lot of users that even when they're scraping, the number of verifieds from those lists is still not good enough. @Tim89, if you don't mind, could you maybe give an overview of the monthly costs of running scraping software, such that it would be possible to get the 200+ lpm you talked about?
  • @grax1 your setup is exactly the same as mine (except I have two dedis running 550 projects between them), and my 'worries' about buying Gscraper and not getting ROI from it are the same.

    If I can throw a question into the mix with yours, can you run Gscraper on the same server as SER or does it need its own?
  • @Judderman,
    I ran Gscraper on my server along with SER without any problems. SER was at 300 threads and Gscraper at 1500 threads. I was using the Gscraper proxy service, and semi-private proxies with SER.

    This ran fine for over a month without any problems. The server is a Xeon 1230, single-proc dedi.


  • @JudderMan, I've also been running Gscraper and SER together for a few weeks now. That's not very long, of course, but it's working out a hell of a lot better than when I was running Scrapebox. I have no idea why, but at 600-800 threads in Gscraper I'm scraping 600-1000 urls/sec 24/7, while in Scrapebox I'd get 150 urls/sec at most and had to rotate proxies every 3 hours. Still using exactly the same proxy source, btw. My only problem now is keeping up with all these goddamn urls. I got a little off the point there, but yeah, I don't seem to have any problems running SER and Gscraper on the same server.
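The "keeping up with all these urls" problem is easy to see once you convert urls/sec into daily volume. These numbers just extend the rates quoted in the post above; the helper is mine, for illustration only:

```python
# Daily URL volume at a sustained scrape rate (urls/sec), running 24/7.
SECONDS_PER_DAY = 60 * 60 * 24  # 86400

def urls_per_day(urls_per_sec: int) -> int:
    return urls_per_sec * SECONDS_PER_DAY

print(urls_per_day(600))  # 51840000 - low end of the Gscraper range quoted
print(urls_per_day(150))  # 12960000 - the Scrapebox ceiling quoted
```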
  • gooner SERLists.com
    Gscraper is much faster, especially if you use their proxy service - No doubt about that.
    But be aware that their proxy service leaks like crazy. You can't use it on SolidSEOVPS as far as I know.
  • If you guys think Gscraper is fast, you should give Hrefer a try.
  • gooner SERLists.com
    @justin - What proxies do you use with it? Let it scrape for public proxies, I guess?
  • I scrape everyone's proxy ports and use a few different public services.
  • gooner SERLists.com
    Cool thanks mate.
  • Thanks all :) Time to invest in a few more goodies then.

  • grax1 Professional SEO, UK | White Label SEO Provider
    Thanks for sharing your experience, guys! @Judderman I think it's also time for me to finally gain more experience in the area of scraping. Today I bought the video tutorials from @donaldbeck; this will be my first step to get a better overview of what these 3 beasts (Scrapebox, Gscraper and Hrefer) can do, and then I will decide which one suits my needs best. Hope it will be worth it :P Any more advice from you guys is appreciated :)
  • @fakenickahl where do you get proxies that let you scrape 1000 urls/sec?

    Best I can do in Gscraper is around 50 urls/sec

  • Honestly, @grax1, I find areas that each of them excels in and use them all for different purposes. Gscraper is usually the fastest for list manipulation and has the classify option to organize lists based on URL/HTML criteria; Scrapebox has a lot of useful addons; Hrefer is faster than either by a long shot, is the most hands-off, and has the Sieve filter, which "classifies" URLs as it scrapes. What I mean by that is you can have Hrefer trash URLs that fall outside of certain URL parameters, like "/node/" or whatever.
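A minimal sketch of the kind of URL sieve described above, written in Python rather than Hrefer's own filter configuration; the banned-substring list is illustrative, not Hrefer's actual syntax.

```python
# Minimal URL sieve: drop any URL containing a banned substring
# (e.g. Drupal's "/node/" paths), keep the rest. Hrefer's real Sieve
# filter is configured inside the app; this only illustrates the idea.

BANNED_SUBSTRINGS = ["/node/", "/tag/", "?replytocom="]  # illustrative

def sieve(urls: list[str], banned=BANNED_SUBSTRINGS) -> list[str]:
    return [u for u in urls if not any(b in u for b in banned)]

scraped = [
    "http://example.com/node/123",
    "http://example.com/blog/post-1#comment",
    "http://example.com/tag/seo",
]
print(sieve(scraped))  # ['http://example.com/blog/post-1#comment']
```

Filtering while scraping, as Hrefer does, keeps junk out of the list before it ever reaches the posting tool.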