
Can't increase my LPM past 24. Can anyone help?

Hi, newbie here.  I've read a ton of the advice here (such as LeeG's and Wezz's posts).  I've decreased the search engines to only 5 (all Google) and eliminated all the non-performing engines (both the ones that returned under 10 results and the ones GSA Captcha Breaker can't solve well).  And based on that advice, my LPM has increased from 1 to a high of 24+, so thanks!  I'm also using around 30 proxies from Buyproxies, and the only captcha service I'm using is GSA CB (though I have Captcha Sniper also).  I'm doing Tiers 2 and 3 of 3 projects.

However, I've been reading about people getting 50+ or even 100+ LPM, and I'm like, how do they do that?!  I mean, I've followed most of the advice already.  Can anyone help me out?  I've included my settings below.  Maybe it's my crappy Berman hosting (only 1GB of RAM); maybe I should upgrade to 2GB?  Thanks a ton!

[screenshots of my project and global settings]

Comments

  • donaldbeck Advanced SER Videos -> http://bit.ly/1ySrbwu | Learn SER For Free Step By Step -> http://sertips.com
    First of all, people getting really high LPM are probably using dedicated servers and not VPSes. What you are running GSA on makes a big difference.

    If you are using private proxies, I don't think you need to check public.

    I check "continuously post to a site even if it failed before". I just want GSA trying to post everywhere as much as possible.

    I'd also check "use URLs linking on same verified domain". I've been doing that for a long time, and it produces good results.

    Also you might want to consider using GSA as a dedicated posting program, and stop using the searching functions altogether. That's personally what I do and I get pretty good results.

    If you purely want the LPM to be high for whatever reason, then:

    - stop searching for new URLs with GSA
    - import lists scraped with a dedicated scraping program (see the sketch below)
    - use global site lists that you know are verified and good to go
    - get a dedicated server
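
    To give a rough idea of the "import lists" step, here is a minimal Python sketch for cleaning a scraped list before feeding it to SER. The file names and the keep-one-URL-per-domain choice are just assumptions for the example, not anything SER requires (it can also dedupe imports itself):

    ```python
    # Minimal sketch: dedupe a scraped URL list before importing it into SER.
    # Assumes one URL per line in scraped_urls.txt; file names are placeholders.
    from urllib.parse import urlparse

    seen_urls = set()
    seen_domains = set()
    kept = []

    with open("scraped_urls.txt", encoding="utf-8", errors="ignore") as f:
        for line in f:
            url = line.strip()
            if not url or url in seen_urls:
                continue
            seen_urls.add(url)
            domain = urlparse(url).netloc.lower()
            if not domain or domain in seen_domains:
                continue  # keep only one URL per domain (optional, my assumption)
            seen_domains.add(domain)
            kept.append(url)

    with open("deduped_urls.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(kept))

    print(f"{len(seen_urls)} unique URLs, {len(kept)} unique domains kept")
    ```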
  • gooner SERLists.com
    I'm not that experienced with SER compared to some, but I do get over 100 LPM every day, so maybe I can help.

    I tried selecting 5 search engines too and it burnt out my proxies very quickly, so now I select all search engines and leave it at that. How does that affect LPM? I don't really know to be honest, but if your proxies are blocked then you will definitely get a lower LPM.

    There is a new option in the project settings, "Use URLs linking on same verified URL". Try turning it on; it increased my LPM from around 70-80 up to 100-130. Worked great for me.

    You don't have "custom time to wait between search engines" checked; maybe set this to 90-120 seconds to protect your proxies.

    You have "stop projects on no active proxies" checked; in that case you should probably also check "restart projects on active proxies". That way, if a project stops because of no active proxies, it should restart automatically once a good proxy is available.

    Hope that helps
  • gooner SERLists.com
    edited September 2013
    You may also want to remove the OBL and PR settings if you checked those; that will make it go faster.

    EDIT:
    As @donaldbeck said, defo uncheck public proxies, I didn't see that.

    I use a VPS and can get good speeds, 4GB RAM on mine, and I use SER to search for targets too; I haven't edited or de-selected any engines.

    Having lots of projects really helps a lot with speed also.

    In my opinion "continuously post to a site even if it failed" will slow down LPM, not speed it up.
  • As @gooner said, more projects = faster LPM, don't set "continuously post" and remove PR & OBL limits.

    I'd also look at 2 or even 4GB of memory, as with 30 proxies you could push your thread count up to 300.  You need to find a balance with search engines, as using too many will end up causing "Already Parsed" messages when it finds sites that have already been found by other search engines.

    You've also got to remember that LPM changes with each new version of SER, and it's not something you should measure against what others achieve, as all of our set-ups are different.  Yes, it's great to be hitting 150 LPM, but actually, what really matters is your ranking.


  • ron SERLists.com

    ^^Everybody above was correct.

    - Your LPM will go through the roof with more projects.

    - Only use private proxies. Leave public proxies for people who are just starting out on a budget.

    - RAM matters. If you can afford a set-up with 4GB, then please do it. If not, you will be fine with 2GB. When you get over 20 projects or so, start using the scheduler, and do 10 projects at a time for 20 minutes each.

    - On 30 proxies, I have it set at 300 threads, 120 sec HTML timeout, 5 sec SE pause. I can easily set it at a 5 second pause between queries because of the next point...

    - I use about 120 search engines, and I never burn out proxies. You can right click on the SE box and select only English-speaking SEs, and you will be in great shape.
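
    To put rough numbers on why that combination spares the proxies, here's a quick back-of-the-envelope sketch. The assumption that queries rotate evenly across the selected engines and proxies is mine, not something SER guarantees:

    ```python
    # Rough arithmetic behind "120 search engines + 30 proxies + 5 sec pause".
    # Assumption (mine): roughly one search query fires per pause interval and
    # queries are spread evenly over the selected engines and proxies.
    pause_sec = 5
    engines = 120
    proxies = 30

    queries_per_hour = 3600 / pause_sec                    # ~720 queries/hour overall
    per_engine_per_hour = queries_per_hour / engines       # ~6 hits/hour on any one engine
    per_proxy_per_engine = per_engine_per_hour / proxies   # ~0.2 hits/hour per proxy per engine

    print(f"{queries_per_hour:.0f} queries/hour in total")
    print(f"{per_engine_per_hour:.1f} hits/hour on each engine")
    print(f"one proxy hits a given engine roughly every {60 / per_proxy_per_engine:.0f} minutes")
    ```

    Under those assumptions each proxy touches any single engine only about once every five hours, which is why a short 5 second pause doesn't get them banned. With only a handful of engines selected, the same queries concentrate on a few Googles, which is where the 90-120 second wait suggested earlier comes in.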

  • LeeG Eating your first bourne

    LpM is not the be-all and end-all; verified is your target

    I can do a low LpM (80 ish) and still have 50k verified, and on a good day, well over 100k verified

    I personally run 200 threads, 40 shared proxies and five random Googles

    That way, I'm always pulling more fresh sites to post on than the repeat results I'd get using more engines

    Fewer search engines, fewer repeat results. In an ideal world, I would run one search engine

    Watch your stats and look at the engines you use that give a lot of verified, then bin the ones that give poor results (see the sketch below for one way to eyeball that)

     

    I did some testing recently. Upped my threads to 300. Google slaps all over the place for too many queries on both searches and recaptcha. Running 300 threads might sound good, but the end result is a total and utter joke. Poor submission rates where the proxies were slapped silly

     

    Needless to say, I went back to my trusty settings and I still bang out daily verified like this.

    These stats are verified and not submitted

    [screenshot of daily verified stats]
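
    As a concrete way of doing the "bin the poor engines" check, here is a small sketch. The engine names, counts and the 5% cut-off are all made-up; in practice you would copy the per-engine submitted/verified numbers out of SER's stats by hand:

    ```python
    # Sketch: rank engines by verified/submitted ratio and flag the poor performers.
    # All numbers below are invented; replace them with your own stats from SER.
    stats = {
        "Article Engine A":  (4200, 610),
        "Blog Comment B":    (15800, 240),
        "Social Bookmark C": (900, 310),
        "Wiki D":            (3100, 45),
    }

    MIN_RATIO = 0.05  # arbitrary cut-off: bin anything under 5% verified

    for engine, (submitted, verified) in sorted(
            stats.items(), key=lambda kv: kv[1][1] / kv[1][0], reverse=True):
        ratio = verified / submitted
        verdict = "keep" if ratio >= MIN_RATIO else "bin"
        print(f"{engine:20s} {ratio:6.1%}  -> {verdict}")
    ```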

     

     

     

     

  • gooner SERLists.com
    @ron - Nice tip about the 5 sec SE pause, it makes sense that with so many search engines that figure can be very low.

    There are two main ways to go about using SER it seems: lots of search engines and more threads, or fewer search engines and lower threads, as @leeg described.

    Amazingly I'm not too far away from those figures myself...

    [screenshot of stats]

    Happy days :D


  • LeeG Eating your first bourne

    When you take the time and put the effort in, the feeling is good though :D

    I prefer to spend my time posting links, not seeing a screen full of already parsed results

    200 threads and my CPU is running at 95% all day long, though posting and not searching

  • gooner SERLists.com
    @leeg - I tried your way and for me it just burnt through my 30 proxies in a matter of days.

    I couldn't understand why until you mentioned "Upped my threads to 300. Google slaps all over the place for too many queries on both searches and recaptcha"

    Now it makes sense because I'm running 300 threads too... So I might have to give it another shot with 200 threads only and see what happens.

    Cheers.
  • LeeG Eating your first bourne

    I was testing Mega OCR at the time, as recaptcha were bringing in the new captcha type

    Took my threads past my comfort zone, ended up with 300 threads running and low CPU

    Low CPU due to not posting much, just aimlessly searching

    Every time I was doing proxy tests, more were dropping out

    Backed it off to 200 and the slaps dropped right off and submissions improved again

     

    Verified is the end goal, LpM just looks good

  • LeeG Eating your first bourne

    Just another quick add to this

     

    Even using services that do recaptcha can also cause Google slaps on proxies

    Mega OCR is down at present, zero recaptchas being answered and no hard captchas being sent through

    No proxies with a slap

  • Thanks for all the great advice!  I've implemented several of these suggestions--specifically, checking the "Use URLs linking on same verified URL" and "restart projects on active proxies" options among other things--and now my LPM is running at 32+ (and with only 2 projects; I unchecked one of them).  Anyway, I think I'm going to buy a 2GB upgrade on my VPS (as suggested by a couple of folks) and see how that goes.  I'm going to keep a close eye on my verified count too since, as LeeG and others have said, that's what matters in the end.  Thanks once again!
  • I agree that fewer SEs result in better results
    with SER or SB
    ideal would be one single SE-country per run

    yesterday I did a test using SB country specific only, multiple footprints
    only footprints that use inurl:
    only 2 or 3 footprints per SB scrape run

    1 SE = country specific
    same footprints, using the country TLD in front of the string following inurl: (providing it is a footprint that appears there on sites); see the sketch below

    and Google offered almost entirely results from that country TLD and no or very few global TLDs (.com, .net, .org, etc.)

    = resulting in more IP diversity and more different countries linking to a site with global visitors (currently some 200 countries / month)

    I use SER for submission only and create site-lists using SB
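
    As a rough sketch of that country-TLD trick, building the query strings might look like the snippet below. The footprint texts and TLD list are made-up placeholders, not recommendations:

    ```python
    # Sketch: build country-specific scrape queries by pushing the country TLD
    # into the inurl: part of each footprint. Footprints and TLDs are placeholders.
    footprints = [
        ('"Powered by ExampleCMS"', "guestbook.php"),  # (quoted text, inurl path)
        ('"Leave a comment"',       "blog/comment"),
    ]
    country_tlds = [".de", ".fr", ".nl"]

    queries = []
    for text, path in footprints:
        for tld in country_tlds:
            # e.g. "Powered by ExampleCMS" inurl:.de/guestbook.php
            queries.append(f"{text} inurl:{tld}/{path}")

    for q in queries:
        print(q)
    # Each batch would then be scraped against the matching country-specific
    # search engine (e.g. google.de for .de), as described above.
    ```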

  • AlexR Cape Town
    @sven - when a proxy gets blocked by an SE, and you set it to disable, is that proxy still used for other tasks? E.g. you have 20 proxies, and all get blocked by Google, and SER disables them until they work. Are they still used for verifying or submitting (e.g. if it's now using a sitelist)?
  • This was very useful.
  • What does this setting do? "Use URLs linking on same verified URL"
  • Having hundreds of projects will help your LPM big time. When SER discovers a verified link, that URL can be used across multiple projects. Actually, it can be used many times in the same project on lower tiers. Upping your email accounts for each project can help by allowing you to post to the same domain many times.
  • However, in my opinion, posting the same link on the same domain multiple times isn't advisable.
  • @Pratik ^^ agreed for tiers pointing to money site.... 
  • So, would you guys post the link multiple times on the same domain if it is not for Tier 1? I always thought it's not advisable?
  • I am NOT advising posting the same link to a page. Let's say that you have 200 tier 1 links for a project. You could theoretically use the same linking URL 200x on tier 2 without hitting the same target page twice. I actually cap it at 10x by using 10 email addresses per project. If you have 100 projects, and you build 10 tier 2 links with the same linking domain, that is 1000 links from 1 scraped URL. That's why lots of projects helps your LPM. Take a hard look at your verified list. If you have 10000 unique linking domains, you probably have hundreds of thousands of links.
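
    To restate that arithmetic as a quick sketch (the numbers simply mirror the ones in the post above):

    ```python
    # Worked version of the link-multiplication arithmetic described above.
    tier1_links_per_project = 200  # distinct tier-1 URLs that tier-2 links can point at
    posts_per_domain_cap = 10      # capped in practice via 10 email accounts per project
    projects = 100

    # In theory one target domain could host a tier-2 link for every tier-1 URL:
    print(tier1_links_per_project)          # 200 possible posts per project

    # Capped at 10 per project and reused across every project:
    print(projects * posts_per_domain_cap)  # 1000 links from a single scraped URL

    # Which is why a verified list of 10,000 unique linking domains can easily
    # translate into hundreds of thousands of built links.
    ```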