
GSA in wtf mode - 1200LPM / 800VPM

Screenshots say everything.

Only contextual engines turned on. 1200 threads.



How to get such LPM/VPM:
1. Dedicated server. A VPS limits your LPM. VPSes are good for some tasks, but not for SER if you are serious about link building.
2. Self-made proxies.
3. Self-made catchalls.
4. OS tweaks.
5. Self-made lists (not obligatory).
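On the catchall point: a self-made catchall just means a mail server that accepts mail for *any* address on a domain you own, so SER can register accounts with random emails. A minimal sketch with Postfix (assumptions: a Debian/Ubuntu box, you own example.com and its MX record points at this server - this is illustrative, not the OP's exact setup):

```shell
# Install Postfix (choose "Internet Site" when prompted)
sudo apt-get install -y postfix

# Route every otherwise-unknown address @example.com to one catchall mailbox
sudo postconf -e 'mydestination = example.com, localhost'
sudo postconf -e 'luser_relay = catchall@localhost'
# Empty local_recipient_maps = accept all local recipients instead of bouncing
sudo postconf -e 'local_recipient_maps ='

sudo systemctl restart postfix
```

After that, anything sent to whatever123@example.com lands in the catchall mailbox, which SER can poll over POP3/IMAP.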

The setup I use (server, catchalls, proxies) costs ~$100 a month.

Conclusion - you don't need multiple VPSes to build ~1 million links a day :)


  • shaun
    Nice, What OS tweaks are you doing?

    Also, if you have been pulling 1218 LPM for over two and a half hours, how come you have only sent 3189 Captchas for solving?
  • Are you hitting the same sites or same accounts? Little bit more info please :)

  • Yeah, this isn't massively impressive if you have like 100000000 links on the same website. Can you show us your verified stats?
  • shaun
    Personally, I think it's a massive achievement getting SER up to 1218LPM, even if all the links are from the same contextual website. He has the method to get SER that high, next all he would have to work on is growing his verified list.

    Judging by the information available, his target list has loaded 3673 and 4567 targets. If they are contextual platforms it's pretty safe to assume that he has only one target per domain, so depending on project settings that could put him at around 8200 target domains from those two loads alone. You can only see a snapshot of his verifieds in the screenshot, but it looks like a solid percentage of Do Follow too.

    The thing that's making me suspicious is the edits to the images, the blank fields where there should be data and the incorrect captcha sends to his captcha service. 

    Even without the edits to the image it's easy to fake this. All you have to do is let SER run for a day or so to build up your submitted and verified URLs in the second and third columns, then stop SER, close it and reopen it. The second and third column data will remain but the rest will wipe. Right click your projects, clear the URL cache, load from site list using a clean verified list and press start. The initial surge of LPM and VPM will be crazy high for a few minutes, which would also explain why the captcha requests are so low.

  • This is the same guy that claimed he could get gscraper up to 100,000 urls per minute per instance. Not impossible but not easy when it was broken.
  • Hinkys - Catchalls for SER - 30 Day Free Trial
    To be fair, 100,000 urls / minute isn't that hard on a dedi, I'm doing 1500 / s (90,000 / min) with Scrapebox on a $30 VPS all day long without any crazy optimizations. It maxes out the CPU tho.
  • @Hinkys

    Yeah, but this was when Gscraper was still broken and everyone was getting like 500/min. Not Scrapebox.
  • @satyr85 - OS tweaks, can you explain?
  • Trevor_Bandura - 267,647 NEW GSA SER Verified List
    Don't you all just love posts like this? Shows some crazy stats but doesn't tell us what to do to achieve them...
  • Hinkys - Catchalls for SER - 30 Day Free Trial
    Oh, I didn't get that part. While I don't know the details, usually when something is "broken", it's the people using it rather than the software itself.

    But anyway, dunno why you're all bashing the guy. Yeah he's probably posting blog comments but it's still some nice stats. Plenty of CPU left to scrape wildly & identify in the background. Wouldn't be this much left if you were posting on article sites tho.

    As far as OS tweaks go, unless he's some networking expert (and he's not), it's probably this (I've yet to see/find a better optimization guide):
  • spunko2010 - Isle of Man
    edited April 2016
    Self made proxies alone would cost more than $100 a month....? You would need loads of dynamic IPs.
  • Very impressive lpm&vpm indeed - but the real question should be what you're actually ranking with those K2's. I'm afraid it won't look as pretty as these stats ;-)
  • @rogerke

    K2 links are basically worthless and/or SEO poison. I ran some tests on a batch of K2 links that I put out: close to 10% were hacked sites, some of them had malware warnings, etc. Some of them had no anchor text at all.
  • @BigGulpsHuhWelp

    I definitely agree and I'm not surprised at your observation that a significant percentage of these sites are hacked. There's likely a correlation between non-moderation (and thus "allowing" this kind of spam) and not updating security features.

    To be fair to @satyr85 though, I'm pretty certain I know the niches and markets he's involved in and even though it's not the strategy most would use or even come up with, it's definitely a profitable one when scaled. If correct I definitely see why he's happy with those lpm stats.

    Also, SEO expands beyond Google :)
  • shaun
    So yeah, the OP has been online a few times since making this thread with no further input, so I'm going to call bullshit on his claims.

    I'm 90% sure he did exactly what I described earlier in the thread to fake the screenshots.
  • I don't understand why you're hiding the CPU. A client of mine on an 8-core X5650 Xeon VPS hit 3000 LPM - why hide the CPU?
  • @Hinkys, care to explain how you're doing 100k/min with Scrapebox? Are you using a ton of private proxies?

    I thought I was doing OK doing 50/s with my public proxies using the Google API..
  • You don't want to use public proxies for SB. Having good internet and dedicated proxies goes a long way. Plus he's probably running a couple of instances, since you can duplicate as many as you want for free.
  • feukim - Singapore
    ouch God,
    did I get the wrong thread, is the OP OK?
    not really sure about this, but still possible.
    he said something about serious link building?
    need some aspirin. :D
    what's wrong with your images around the CPU statistic?

  • googlealchemist - Anywhere I want
    What is a 'self made proxy'?

    What are your OS tweaks?

    Where do you find a good dedi for that low cost, and what's the bulletproof status?
  • You see @shaun, your last post is the answer to why I didn't reply earlier. I just wanted to show what's possible with SER and point out the possible bottlenecks (proxies, emails, VPSes etc). If something is hard to do, that doesn't mean it's bullshit. I have no reason to post fake screenshots, and I don't have time to argue about whether these screenshots are fake or not.

    High XXX LPM/VPM is possible with SER on contextual engines, but it's mass linking - useful on lower tiers and only if you have your own indexing method. All these K2s and other engines won't get indexed - that's the main problem. Without spending $xxx a month on indexing, or having your own indexer able to process xxx k links a day at a 10%+ index ratio, this kind of mass link building with SER doesn't make sense.

    Yeah, I was able to get high LPM in Gscraper when it was partially broken; then it became (and still is) fully broken and no one was able to pass the 10k a minute barrier.

    I use Windows 2012 R2 and some of these settings are not good for 2012 R2, but it's a good guide to start from. Such TCP/IP tweaks can put your dedi offline, and it won't be accessible without KVM, so be sure you know what you are doing before you play with these settings.
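For context, the kind of TCP/IP tweaks usually meant here look like the following - an illustrative sketch for Windows Server 2012 R2 run from an elevated PowerShell prompt, not the OP's exact settings (all values are assumptions; as warned above, a bad value can knock the box offline):

```shell
# Leave receive window auto-tuning on so throughput isn't capped
netsh int tcp set global autotuninglevel=normal

# Widen the ephemeral port range so thousands of concurrent
# outbound connections don't exhaust local ports
netsh int ipv4 set dynamicport tcp start=10000 num=55535

# Shorten TIME_WAIT so closed sockets recycle faster (needs reboot)
reg add HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters /v TcpTimedWaitDelay /t REG_DWORD /d 30 /f
```

Test each change one at a time and keep KVM/console access handy before touching anything network-related.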

    Yes, same websites, multiple posts per account.

    Generally you don't need a lot of proxies to post with SER if you don't target recaptcha sites. "You need a lot of proxies" is a myth, and there's a big business built around that myth.

    Self-made proxy = buy a server/VPS, set up a proxy script - Squid, Tinyproxy, Privoxy, Dante SOCKS or something similar - and you have a proxy.
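A minimal sketch of that recipe using Tinyproxy (assumptions: a Debian/Ubuntu VPS, and 203.0.113.7 is a placeholder for the IP you will connect from):

```shell
sudo apt-get install -y tinyproxy

# Listen on port 8888 and only allow connections from your own machine
sudo sed -i 's/^#\?Port .*/Port 8888/' /etc/tinyproxy/tinyproxy.conf
echo 'Allow 203.0.113.7' | sudo tee -a /etc/tinyproxy/tinyproxy.conf

sudo systemctl restart tinyproxy

# Quick check from the allowed machine:
#   curl -x http://VPS_IP:8888 -I https://example.com
```

Then add `VPS_IP:8888` to SER's proxy list. Squid or Dante work the same way conceptually; Tinyproxy is just the lightest to configure.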

    You don't need a bulletproof server if you run GSA with proxies. is good - it's a very whitehat datacenter (so proxies are obligatory) but the hardware is cheap, support is fast, and for ~$50 a month you can get a 300 Mbps server - enough for SER + SB.

    Hinkys is probably not scraping Google, and that's how he gets such high LPM. High LPM on Google these days is nearly impossible.

    Btw - a lot depends on the SER version. From my experience, not all SER versions can run with high LPM/VPM. I don't know why; that's just my observation.
  • I remember the days of 100,000 with Gscraper. I think it was bait and switch. I did the 7-day trial and got 100k; then a few months later I bought Gscraper proxies for 2 months, forgot to cancel, and could never get that high again.