Can't increase my LPM past 24. Can anyone help?
Hi, newbie here. I've read a ton of the advice here (such as LeeG's and Wezz's posts). I've cut the search engines down to only 5 (all Google) and eliminated all the non-performing engines (both the ones that returned fewer than 10 results and the ones GSA Captcha Breaker can't solve well). Based on that advice, my LPM has increased from 1 to a high of 24+, so thanks! I'm also using around 30 proxies from Buyproxies, and the only captcha service I'm using is GSA CB (though I also have Captcha Sniper). I'm running Tiers 2 and 3 of 3 projects.
However, I've been reading about people getting 50+ or even 100+ LPM, and I'm like, How do they do that?!! I mean, I've followed most of the advice already. Can anyone help me out? I've included my settings here. Maybe it's my crappy Berman hosting (only 1GB); maybe I should increase to 2GB? Anyways, here are my settings. Thanks a ton!
Comments
I tried selecting 5 search engines too and it burnt out my proxies very quickly, so now I select all search engines and leave it at that. How does that affect LPM? I honestly don't know, but if your proxies are blocked then you will definitely get a lower LPM.
There is a new option in the project settings, "Use URLs linking on same verified URL". Try turning this option on; it increased my LPM from around 70-80 up to 100-130. Worked great for me.
You don't have "custom time to wait between search engines" checked; maybe set it to 90-120 seconds to protect your proxies.
You have "stop projects on no active proxies" checked, so you should probably also check "restart projects on active proxies"; that way, if a project stops because of no active proxies, it should restart automatically once a good proxy is available.
Hope that helps
EDIT:
As @donaldbeck said, defo uncheck public proxies; I didn't see that.
I use a VPS and get good speeds, 4GB of RAM on mine, and I use SER to search for targets too; I haven't edited or de-selected any engines.
Having lots of projects also helps a lot with speed.
In my opinion "continuously post to a site even if it failed" will slow down LPM, not speed it up.
I'd also look at 2 or even 4GB of memory, as with 30 proxies you could push your thread count up to 300. You need to find a balance with search engines, as using too many will end up causing "Already Parsed" messages when it finds sites that have already been found by other search engines.
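One way to see whether an extra search engine is actually adding new targets, rather than just "Already Parsed" noise, is to compare the domains it scrapes against an engine you already use. A minimal sketch, assuming you've dumped each engine's scraped target URLs to plain text files (the file names below are invented, this is not a SER feature):

```python
# Rough sketch: estimate how much two engines' scraped target lists overlap.
# Assumes one URL per line in each file; file names are made up.
from urllib.parse import urlsplit

def load_domains(path):
    """Read URLs from a file and reduce them to bare hostnames."""
    domains = set()
    with open(path, encoding="utf-8", errors="ignore") as fh:
        for line in fh:
            line = line.strip()
            if not line:
                continue
            host = urlsplit(line).netloc.lower()
            if host:
                domains.add(host)
    return domains

a = load_domains("google_us_targets.txt")   # hypothetical export
b = load_domains("google_uk_targets.txt")   # hypothetical export

overlap = len(a & b)
print(f"Engine A: {len(a)} domains, Engine B: {len(b)} domains")
print(f"Shared: {overlap} ({overlap / max(len(a | b), 1):.0%} of the combined list)")
```

If most of the second engine's domains are already in the first list, it's mainly feeding you repeats.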
You've also got to remember that LPM changes with each new version of SER, and it's not something you should measure against what others achieve, as all of our set-ups are different. Yes, it's great to be hitting 150 LPM, but what really matters is your ranking.
^^Everybody above was correct.
- Your LPM will go through the roof with more projects.
- Only use private proxies. Leave public proxies for people who are just starting out on a budget.
- RAM matters. If you can afford a set-up with 4GB, then please do it. If not, you will be fine with 2GB. When you get over 20 projects or so, start using the scheduler, and do 10 projects at a time for 20 minutes each.
- On 30 proxies, I have it set at 300 threads, 120 sec HTML timeout, 5 sec SE pause. I can easily set a 5-second pause between search queries because of the next point (rough per-proxy numbers are sketched after this list)...
- I use about 120 search engines, and I never burn out proxies. You can right-click on the SE box and select only English-speaking SEs, and you will be in great shape.
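A back-of-envelope sketch of why that combination can be gentle on proxies: if queries rotate evenly across 30 proxies and ~120 engines, any single proxy (and any proxy/engine pair) is touched far less often than the 5-second pause suggests. The even-rotation assumption and the helper below are mine, not anything SER exposes:

```python
# Rough sketch, not SER code: how often one proxy gets hit with a search
# query, assuming queries round-robin across proxies and engines.

def per_proxy_interval(num_proxies, pause_seconds, num_engines):
    """Return (seconds between queries on one proxy,
               seconds between hits on the same proxy/engine pair)."""
    # A query fires every `pause_seconds`; with round-robin rotation a given
    # proxy is reused roughly once every `num_proxies` queries.
    per_proxy = num_proxies * pause_seconds
    # Spreading those queries over many engines dilutes the load on any
    # single engine seen from that proxy even further.
    per_engine_per_proxy = per_proxy * num_engines
    return per_proxy, per_engine_per_proxy

# The commenter's numbers: 30 proxies, 5 s pause, ~120 engines.
proxy_gap, engine_gap = per_proxy_interval(30, 5, 120)
print(f"One proxy fires a query roughly every {proxy_gap} s")          # ~150 s
print(f"One proxy hits the same engine roughly every {engine_gap} s")  # ~18000 s
```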
LpM is not the be-all and end-all; verified is your target
I can do low LpM (80-ish) and still have 50k verified, and on a good day well over 100k verified
I personally run 200 threads, 40 shared proxies and five random Googles
That way, I'm always pulling more fresh sites to post on rather than the repeat results you get from using more engines
Fewer search engines, fewer repeat results. In an ideal world, I would run one search engine
Watch your stats and look at the engines you use that give a lot of verified, then bin the ones that give poor results
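If you want to do that "bin the poor performers" step with numbers rather than eyeballing, a small script can rank engines by verified-to-submitted ratio. This is only a sketch under assumptions: SER shows per-engine stats in its UI, but the CSV file name, column layout, and the 5% cut-off below are all made up for illustration.

```python
# Rough sketch, not a SER feature: rank engines by verified/submitted ratio.
# Assumes you've copied per-engine counts into a CSV like:
#   engine,submitted,verified
#   "Article - SomeEngine",1200,35
import csv

def rank_engines(path, min_submitted=100):
    rows = []
    with open(path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            submitted = int(row["submitted"])
            verified = int(row["verified"])
            if submitted < min_submitted:
                continue  # not enough data yet to judge this engine
            rows.append((row["engine"], verified / submitted, submitted, verified))
    return sorted(rows, key=lambda r: r[1], reverse=True)

for engine, ratio, submitted, verified in rank_engines("engine_stats.csv"):
    flag = "  <-- consider binning" if ratio < 0.05 else ""  # 5% is an arbitrary cut-off
    print(f"{engine:35s} {verified:6d}/{submitted:6d}  ({ratio:.1%}){flag}")
```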
I did some testing recently. Upped my threads to 300. Google slaps all over the place for too many queries on both searches and ReCaptcha. Running 300 threads might sound good, but the end result is a total and utter joke: poor submission rates where the proxies were slapped silly
Needless to say, I went back to my trusty settings and I still bang out daily verified like this.
These stats are verified and not submitted
There seem to be two main ways to go about using SER: lots of search engines and more threads, or fewer search engines and lower threads as @leeg described.
Amazingly, I'm not too far away from those figures myself...
Happy days
When you take the time and put the effort in, the feeling is good though
I prefer to spend my time posting links, not seeing a screen full of already parsed results
200 threads and my CPU is running at 95% all day long, though that's posting and not searching
I couldn't understand why until you mentioned "Upped my threads to 300. Google slaps all over the place for too many queries on both searches and recaptcha"
Now it makes sense, because I'm running 300 threads too... So I might have to give it another shot with 200 threads only and see what happens.
Cheers.
I was testing Mega OCR at the time, as ReCaptcha was bringing in the new captcha type
Took my threads past my comfort zone, ended up with 300 threads running and low CPU
Low CPU due to not posting much, just aimlessly searching
Every time I was doing proxy tests, more were dropping out
Backed it off to 200 and the slaps dropped right off and submissions improved again
Verified is the end goal, LpM just looks good
Just another quick add to this
Even using services that solve ReCaptcha can cause Google slaps on proxies
Mega OCR is down at present: zero ReCaptchas being answered and no hard captchas being sent through
No proxies with a slap
With SER or SB, the ideal would be one single country-specific SE per run.
Yesterday I did a test using SB, country-specific SEs only, with multiple footprints:
- only footprints that use inurl:
- only 2 or 3 footprints per SB scrape run
- 1 SE = country specific
- the same footprints with the country TLD placed in front of the string following inurl: (providing it is a footprint that appears there on sites; a rough query-building sketch follows below)
Google returned almost entirely results from that country TLD and few or no global TLDs (.com, .net, .org, etc.),
resulting in more IP diversity and more different countries linking to a site with global visitors (currently some 200 countries/month).
I use SER for submission only and create site lists using SB.
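For anyone wanting to try that footprint trick, here is a minimal sketch of how I read it: take plain inurl: footprints and place the country TLD in front of the string that follows inurl:. The helper name, footprints, TLD and keyword below are hypothetical examples, not SB or SER functionality, and other readings of the trick are possible.

```python
# Rough sketch (my own interpretation): build country-targeted scrape
# queries by putting the country TLD in front of the inurl: string.

def country_queries(footprints, country_tld, extra_keyword=""):
    """Turn plain inurl: footprints into country-specific queries."""
    queries = []
    for fp in footprints:
        if not fp.startswith("inurl:"):
            continue  # this approach only applies to inurl: footprints
        path = fp[len("inurl:"):].lstrip("/")
        # e.g. inurl:guestbook.php  ->  inurl:.de/guestbook.php
        query = f"inurl:{country_tld}/{path}"
        if extra_keyword:
            query += f' "{extra_keyword}"'
        queries.append(query)
    return queries

# Example: 2-3 footprints per scrape run against a German country-specific SE.
footprints = ["inurl:guestbook.php", "inurl:/member/register.php"]
for q in country_queries(footprints, ".de", "gartenbau"):
    print(q)
```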