
200LPM Consistently! My Settings Included

edited June 2014 in Other / Off Topic
(Edit - 5pm it's at 280LPM and 200k links for now, running Active V for an hour)

Finally! I have found a setup that gives me an average of 200LPM consistently, even across SER updates (it used to be 100LPM, then would drop to 0.04LPM after updating without changing anything). I figured out that some of the settings/filters change when you update. Remember, it warns you before updating that any unsaved work will be lost, so as soon as you see the green update box, back up your SER, update, then restore your project files.
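
A minimal sketch of that backup step in Python, assuming SER keeps its data under %APPDATA%\GSA Search Engine Ranker (the data path and backup folder are assumptions; adjust to your install):

    # Copy the whole SER data folder (projects, site lists, options)
    # to a timestamped backup folder before clicking update.
    import os, shutil, time

    src = os.path.join(os.environ["APPDATA"], "GSA Search Engine Ranker")
    dst = os.path.join(r"C:\SER-backups", "ser-" + time.strftime("%Y%m%d-%H%M%S"))
    shutil.copytree(src, dst)
    print("Backed up to", dst)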

Over the past two days I've been getting 180-280LPM, and that's with SER doing its own scraping! I have loaded my own scraped lists previously, but they gave <0.01% verified results, so this is all down to SER really.

Settings
OK, I have a very quick server (32GB RAM, octo-core, 1GB uplink) and I know it's overkill, but I reckon you could achieve the same results on an SSD VPS or a normal VPS from SolidSEO. I will be buying one of those soon for another copy of SER, but only for posting. I use my server for other things too, so it's worth it for me, and it's not much more than the fastest SSD VPS anyway. Thanks to @gooner for helping with the settings.

- HTML timeout 180 seconds
- Time between searches 40 seconds (I used to run at 5-12 seconds previously... not good; see the rough per-proxy numbers after this list)
- 40 semi-dedicated proxies (I need to buy more, I know. I cancelled my private proxies as they weren't good enough; ProxyHub are crap these days)
- Never verify (I run Active V once per day for a few hours, as I can run 1000 threads and 150 projects at the same time)
- 700 Threads
- CB and EVE (in that order)
- 450 projects (usually 3-7 tiers/projects per URL)
- Some spam projects to boost all verified links of all T1A+ links
- Some high PR projects to boost all verified links of all T1 links Global Only (run some high PR projects for a few weeks to build up a base)
- Bad word list of 450 words (search for them online)
- Do not check 'continuously try to post'
- Check 'post to same URL' with X amount of time between posts
- Try and put anchor in comments/descriptions
- 100 OBL (not sure this is a good idea, as I'm more inclined to just get more links)
- Use global lists but only submitted and verified.
- 20/20 Scheduler - so you don't NEED lots of projects, as I only run 20 at a time. Any more and I find that some of the active projects show N/A.
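
For anyone wondering why the jump from 5-12 seconds to 40 seconds matters, here's the rough per-proxy arithmetic (assuming SER rotates the proxies evenly between search queries, which is a simplification):

    # How often any single proxy hits a search engine.
    proxies = 40
    for wait in (5, 12, 40):                 # seconds between searches
        gap = proxies * wait                 # seconds between hits per proxy
        print(f"{wait:>2}s wait -> one query per proxy every ~{gap / 60:.0f} min")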

In a week or so, I'll compare the rankings of my sites and post back. Hopefully more LPM means higher rankings (in my case; not necessarily everyone's). I'm sure once I learn how to scrape better I'll see stupid LPM, but I wanted to nail the right settings with SER doing everything first. That way I can run one copy of SER doing everything and one just for posting lists from GScraper/Scrapebox.


Comments

  • gooner SERLists.com
    Nice work mate, looks like you've cracked it, and yeah, good lists will add another 200 LPM on top of that, I'm sure.
  • I think @Sven deserves a medal too; his tweaks have helped reduce the memory issues I've been having.

    Lists are next week's focus ;)

    There was an issue with Incredible Indexer too, so I had to manually dump 100k links the other day, and I saw stupid increases in rankings today.
  • Brandon Reputation Management Pro
    I think the biggest thing you're seeing is 450 projects using the submitted and verified site lists... that means if one project finds a URL, you'll submit to that URL 450 times. Once it's verified, you'll submit to that URL another 450 times.

    The LPM is good because you're running the same URLs through 450 projects, twice (submitted and verified).

    Just my thoughts, not trying to be negative; I always appreciate tips that make me think.
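
    Putting rough numbers on that multiplication (the 450 projects and the two list types are from this thread; the totals are just illustrative arithmetic):

        # One discovered URL, re-used by every project from both the
        # submitted and the verified global lists.
        projects = 450
        list_passes = 2                  # submitted list + verified list
        print(projects * list_passes)    # 900 submission attempts from one URL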
  • edited February 2014
    Hmm, when exactly does it use global lists? In between searches? I have it searching AND using global lists. I'll run it tomorrow with global lists unchecked.
  • I'd also noticed a reduction in CPU consumption when one of my projects went into verify mode. I was too lazy to act on it; doing this now. Thanks.

    And whoa! 32GB of RAM. I'm on a 2GB VPS.
    ^:)^
  • Brandon Reputation Management Pro
    @Judderman, it is supposed to use imported URLs first, then global lists, then searching. Searching is the last resort, when it runs out of URLs on the server.
  • goonergooner SERLists.com
    I think @brandon is right; it's almost impossible to get 200 LPM with SER scraping and no site lists.
    But it is possible to get higher than 200 LPM with SER not scraping at all, using only imported and global site lists.
    @judderman is somewhere in the middle of that.
  • @JudderMan, nice share, glad to hear that you're starting to get some nice results.


  • Cheers guys, I'll take the input on board and tweak/test. Technically, my global list is 'my' network then ;) I might reduce the OBL on the T1s to force SER to use better links.

    As always the rankings are the most important bit so I'll keep a close eye on them over the coming days/weeks.
  • @Judderman - how do you verify projects once per day? Do you just click the 'stop' sign, change all projects to 'verify', and wait a few hours?
  • @Judderman you said:
    "- Check post to same URL with x amount of time between posts"

    Is this the option below?
    [screenshot]

    What amount of time do you give, in minutes?


  • @micha yep, just stop all projects and run Active V for an hour or so. It should be at least 12 hours after you started submitting, though, as that usually gives the links enough time to be accepted. I usually wait 30 minutes between posts on the same site.
  • edited February 2014
    Nice one @JudderMan, congrats on these insane stats...and thanks for sharing your settings.

    Looking forward to your results with global lists unchecked.
  • Without global lists it's running at 75-120LPM, but I'm getting proxies failing. Definitely time to get rid of ProxyHub.
  • 75-120LPM is still awesome considering SER is scraping. Even with global lists I only get around 120-150LPM, lol.
  • Great. What's your verified ratio?
  • I'll leave it running without global lists all weekend and see what it is after a few days running. 

    @Pratik - As I verify them myself using Active V for an hour a day, I usually get 50-80k verified from 200k+ submitted. I have Incredible Indexer, so it can take 100k a day, and I'll try to get my money's worth out of it. I'll be able to update you on Monday with the figures for no global lists; no point quoting a figure now after only a few hours of running.
  • That ratio is incredible! Nice work there.
  • Don't forget that many might be on the same site, though. I'll run my verified list through Scrapebox and see how many unique domains there are.
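
    For anyone without Scrapebox, a minimal Python sketch of the same unique-domain count (the file name is hypothetical, and it assumes one full http(s) URL per line):

        # Count unique domains in a verified-URL list.
        from urllib.parse import urlparse

        with open("verified.txt") as f:
            domains = {urlparse(line.strip()).netloc.lower()
                       for line in f if line.strip()}
        print(len(domains), "unique domains")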
  • I have 60 proxies and had 'time between searches' set to 15s. I put it up to 40s and saw an immediate, slight increase in LPM. LPM isn't that crucial to me, but an increase in efficiency is always welcome. Thanks for the write-up.
  • ron SERLists.com

    Good job @JudderMan. And even more impressive since you only joined a few months ago.

    I'm sure some of that LPM came from footprints. I did this exercise a year ago and it makes a difference, but it's quite a bit of work. And then Sven changes the script in an engine to make it work better, and you need to update the modified engine again. So it never stops, lol.

    What I found when I deleted duplicate domains - drumroll - was that 95% were duplicates. So be prepared to stain your shorts, haha. Then you stare at that information, and you begin to change the game plan.

    There is nothing absurd about 95% duplicates. You start with 1 link in T1, which becomes 10 links in T2 and 100 links in T3. In theory you should have less than 1% unique domains by that math. The only reason it made it to 5% is that the T3 could never manufacture 100 links for every link in T2. Hell, I'm lucky if my T3 is twice my T2, lol.
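
    Spelling that arithmetic out (the 1/10/100 fan-out is from the post above; treating every lower-tier link as hitting an already-seen domain is the simplifying assumption):

        # Each T1 link fans out to 10 T2 and 100 T3 links, and the lower
        # tiers mostly land on domains the list has already seen.
        def unique_share(t2_per_t1, t3_per_t1):
            total = 1 + t2_per_t1 + t3_per_t1
            return 1 / total                   # only the T1 domain is "new"

        print(f"{unique_share(10, 100):.1%}")  # theory: ~0.9% unique
        print(f"{unique_share(10, 20):.1%}")   # T3 only ~2x T2: ~3.2%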



  • Thanks @Ron - I used Santos's invaluable footprint tool, and checking the lists it really is making a difference - for instance, the 'Drupal blog' engine went from 55 targets to 1,127 with modified footprints. I've only done the article directories and I still have lots more footprints to get through, so results should get even better.

    LOL at the duplicates. I knew that checking the 'post to the same domain' box would give an increase in verified/LPM, but as long as the rankings are going up, I'm happy. I know I'll need to diversify and keep constantly tweaking to keep Google guessing.
  • @Judderman, are you scraping links with SER, or do you just import lists?
    Do these footprints also help if you're importing lists rather than scraping from Google?
  • I've been getting 100-150LPM over the weekend running with no global lists - that equates to 40k verified from around 200k submitted. However, this could be skewed by submitted/verified links left over from the previous settings. I really need to modify the rest of the engines, as I think that's where the real results come from.

    I've trimmed my total projects back to 265 from 450 for various reasons, i.e. wasted resources, I can't see T3+ being that effective (I may add them again if that proves wrong), and I want to give more focus to the other tiers. Running 20/20 on the scheduler means there is too much of a gap between projects getting some juice each day (see the rough rotation numbers at the end of this comment).

    @micha scraping, no list. I think the footprints are only useful for scraping, but I can see that they 'might' be useful for imported lists/identifying them... not sure though.
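
    Rough numbers behind that scheduler gap (assuming the 20/20 scheduler simply rotates batches of 20 projects every 20 minutes; the round-robin model is an assumption):

        # Time between runs for any one project under a 20/20 scheduler.
        import math

        def hours_between_runs(total_projects, batch=20, minutes=20):
            return math.ceil(total_projects / batch) * minutes / 60

        for n in (450, 265):
            print(f"{n} projects -> each runs every ~{hours_between_runs(n):.1f} h")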
  • Hey JudderMan,

    How do you set up your tiered campaigns?
    I mean, what tier structure do you use for each project, and what backlink types or platforms do you use in each tier?

    Best regards,
    Daniel
  • Just to update this:

    I'm running only contextuals, no posting on the same domain, no global lists (search only), no 'continuously try to post', and it's at 99LPM.

    I've trimmed my total projects down to 230 for now, but this will increase back to around 400 soon, as I've just finished my e-commerce site, and I have two new e-commerce clients coming, so it will be around 500 projects in total by next month. As soon as my e-commerce site starts making money I'll buy another copy of SER to run my own sites on and keep a separate one for client sites. Once I get some more skillzzz in scraping lists, I'll switch it over and run all projects on one SER while creating verified lists on the other.

    My current scraping results are terrible... 1,500 verified out of 2 million (under 0.1%). On and off, it's taken a few weeks to clear through the list. At least there's room for improvement :)
  • goonergooner SERLists.com
    It looks like you're heading in the right direction. Well done, fella.
  • Good stuff Judderman :)
  • sagarpatil 1LinkList Ninja
    Under 'How to get Target URLs' > 'Search engines to use' -

    do you select all the search engines?

  • @sagarpatil no mate, just use 5-10. I only use UK ones for my UK sites and US ones for everything else. You'll get too many duplicate results if you use too many engines.