LPM: Running No Global Lists
Hi guys,
What's everyone's LPM lately when running no global lists? I've noticed a sharp deterioration in SER performance recently. Here is what I've done:
1) Running 50 private proxies, and I'm finding they get banned pretty fast, so searching is over pretty quickly.
2) Even when carefully rotating my Google engines and buying new proxies to minimize bans, I find that GSA isn't scraping my targets, and I'm getting only about 100 submits an hour for my tier 2 campaign, which is running an optimized set of identified/verified engines.
3) I've loaded up GScraper and scraped a 200k URL list using the footprints in the engines, and I end up with only about 200 submits. Very poor hit rate.
Has anyone noticed similar things?
Comments
I'm also using a 400k keyword list, each 1-2 words. I've also optimized my engines, keeping the high-verified ones.
I don't use any sitelists or imported targets.
Right now it's just past noon, so this is a half day. I don't verify anything other than T1; all other tiers and junk tiers are disabled for verification - I verify those once a week. That's why you see a low verified count.
I typically average 15% - 20% verified in total across all tiers. I know that people like @Lee have invested a ton of time in figuring out platforms, engines, settings, etc., and he has posted proof on this board that he is hitting a crazy verified %, like 35%. So it requires analysis, tinkering, logic, and persistence. Remember, some platforms yield a higher verified % than others. All of this factors into the mix.
Yeah I use 10 projects every 20 minutes.
@Pratik, I think the real issue is the number of projects. I know when I have fewer projects, my LPM goes down. I have about 110 projects, so I have a lot of food for SER.
I have 6 GB RAM, 60 GB HDD, 2.0 GHz E5 Xeon.
With the scheduler set as mentioned above, I average only 50% CPU.
Task Manager has my average memory for SER at 400 MB.
Task Manager has my total memory at 1.2 GB, so I obviously have a lot of room.
In the beginning I *used to* run all projects at once. But as I added projects, that was no longer feasible. Hopefully you are using the scheduler. If not, please do so. It requires just a fraction of the CPU and memory compared to running everything simultaneously.
Yeah, VPS. My LPM is consistently higher with the scheduler.
I used to feel the same way about the scheduler. The problem I started running into, as my projects got bigger, was that I had to split my runs so I wouldn't overwhelm my RAM and CPU. It just became stupid, and I was babysitting all the time.
I think the reason I shied away from it (9 months ago) was that it was new and some people were crashing all the time (while I wasn't), so of course I stayed away. But that was eons ago. Now I would be lost without it.
@AlexR - I only verify T1 daily, as those go straight to the money sites. Everything else is set to once a week, just because I want SER posting more and verifying less.
@Pratik - You will find with SER that things get faster over time. You'll have more projects, you'll get better at picking the engines to use, and like everyone else, you will stumble onto settings and combinations of things that make SER go faster. Keep in mind that my LPM is never stable. One day it's one number, the next day it's different. There are so many factors, like connection speed, which sites are getting today's posts, and website server issues, that can affect LPM.
Screw resting the proxies. You take a nap if you need to, but your proxies don't need a break. Let them rip 24/7. You'll be fine.
With no global site lists or scraped lists:
Contextual Only - about 40 LPM
Contextual + Kitchen Sinks (everything) - about 140 LPM
Kitchen Sinks Only - about 200 LPM
@Pratik, first make sure you keep track of your original engines vs. the ones you just killed as a result of your evaluation. Put it in Notepad or your spreadsheet.
The one comment I would make is that when I evaluated verified vs. submitted, I had two criteria: 1) drop everything under 10% verified, except 2) where the absolute number of verified links was large. So I used some qualitative judgment - I didn't want to throw away a large number of links just because the verified % was crappy.
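That filter is easy to apply outside SER if you export your per-engine stats to a spreadsheet. Here's a minimal sketch in Python, assuming a CSV with engine/submitted/verified columns; the file name, column names, and the 500-link cutoff for "a large absolute number" are my own assumptions, not from the post:

import csv

# Rule described above: drop engines under 10% verified,
# unless the absolute verified count is still large.
MIN_VERIFIED_RATE = 0.10
KEEP_IF_VERIFIED_AT_LEAST = 500  # hypothetical cutoff for "large absolute number"

keep, drop = [], []
with open("engine_stats.csv", newline="") as f:      # hypothetical export
    for row in csv.DictReader(f):                    # columns: engine,submitted,verified
        submitted = int(row["submitted"])
        verified = int(row["verified"])
        rate = verified / submitted if submitted else 0.0
        if rate >= MIN_VERIFIED_RATE or verified >= KEEP_IF_VERIFIED_AT_LEAST:
            keep.append(row["engine"])
        else:
            drop.append(row["engine"])

print("Keep:", ", ".join(keep))
print("Drop:", ", ".join(drop))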
As far as the sitelists go, you already know I don't use them. I have to be honest here - over the last year since this forum started, I can't think of one issue that has caused more 'I have a problem' threads than those having to do with sitelists. Which is why I was so determined to see how fast SER could get without any assistance from sitelists or imported scraped URLs. Not saying these things are evil or anything. Some guys like @doubleup and others just break the LPM meter with scraped URLs. But sitelists are a whole different animal.
I don't use scraped lists. I always get better results with submitted and verified from the global lists.
Submitted is 250k+ daily and verified is about 50k+ daily.
^^And that's coming from the LPM Master himself.
I think what some of the guys do is run big scraped lists into a lower tier, let them process, and then it just grows their verified sitelists. I don't know if @Lee still does that, but he was messing around with that a number of months back.
I tried it several times, and each time it slowed submission too much for my liking.
Set up a scrape, wait 24 hours, add a million or two links, and then get poor submission results.
Then I found another use for GScraper, which was editing the footprints.
Then I binned that idea because of the constant updates to the engine files.
Too much like hard work keeping the engine files up to date.
I get good speeds even using Mega OCR; you just need to know how to set up a service like that to integrate into SER for best performance.
I'm still testing and evaluating ideas and getting some good results as I build on those.
But I pull results like this 7 days a week.
As for global site lists, I can't see why you guys resist using them.
Ron said they cause a lot of problems, when in reality very few are caused, if any.
The only thing you have to remember with global site lists is to delete the duplicates on a regular basis.
I killed about 4 million today.
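For anyone who also keeps exported site lists outside SER (SER has its own duplicate-removal tools as well), a minimal dedupe sketch follows; the folder path and one-URL-per-line .txt layout are assumptions:

import glob
import os

SITELIST_DIR = r"C:\sitelists"   # hypothetical folder of exported .txt site lists

for path in glob.glob(os.path.join(SITELIST_DIR, "*.txt")):
    with open(path, encoding="utf-8", errors="ignore") as f:
        urls = [line.strip() for line in f if line.strip()]
    unique = list(dict.fromkeys(urls))   # keeps order, drops exact duplicate URLs
    removed = len(urls) - len(unique)
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(unique) + "\n")
    print(f"{os.path.basename(path)}: removed {removed} duplicates")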
The scheduler was the main part of SER that caused problems, and that's been bug-free for months.
One of the things I want to point out is that @Lee probably has the LARGEST site list of any SER user. He was there at the beginning and has trounced people with his LPM. So if he has been doing that for a year, imagine how large his sitelist is.
My point, @Pratik, is that you are at the beginning. Your sitelist has barely begun. It's not going to make your LPM spinmeter break the glass, if you know what I mean.
That's why I advocate (at least for the new guys) running SER without the sitelist. Try to learn the software. See what settings can make your LPM get high - without sitelists. Get better with your engine selections, etc. Then you are in a strong position to make it go even faster.
If you can do that successfully, then you can just imagine what will happen when your sitelist is much bigger - and you turn that on. Then you'll be chasing @Lee.
I mean, how did you determine that this footprint is good and that one is bad?