As you can see, I'm really crushing my hardware here (this is my home-PC setup). The submitted/verified ratio is around 25%, but my index tiers only get verified every 5000 minutes...
I'm fighting that proxy problem by feeding SER with Scrapebox lists. It's an additional 10 minutes of work daily, but I think it's a good workaround (and better than buying xx proxies per month). The Scrapebox pages are mostly fresh and virgin, and therefore ripe to get spammed...
Another thing: why are some people not using the global site list? I think it's a great feature, but I would really like to know why you are avoiding it.
PS: I thought about only verifying the index/kitchen-sink/whatever tiers every 5-7 days... but maybe that's not smart, because SER will NOT find a lot of the backlinks (they are there, they have just rolled off the page), so they don't get sent to Lindexed etc.
You probably need to revisit the whole verified vs. submitted data and focus on the contextual engines. I'm sure that will help. I changed the footprints a long, long time ago and have just kept what I figured out back then. If you do the footprint thing, hopefully you have GScraper, because that is a huge help in the evaluation process. I remember when I first did the contextuals a while back, I only got some paltry number like 5 LPM. I can't remember what I changed or when, as it was a while back. It was probably both of those things.
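To make "the evaluation process" concrete, here is one hedged way to score footprints by scrape yield. It assumes you run each footprint as a separate GScraper job and save each result set to its own file named after the footprint; that file layout is my convention for the sketch, not a GScraper feature:

```python
# Hedged sketch: rank footprints by how many unique domains each one scraped.
# Assumes one results file per footprint (e.g. from separate GScraper runs),
# with footprint names sanitized into the file names -- my own convention.
from pathlib import Path
from urllib.parse import urlparse

RESULTS_DIR = Path("scrapes")  # placeholder folder: one <footprint>.txt per footprint

scores = {}
for f in RESULTS_DIR.glob("*.txt"):
    urls = (u.strip() for u in f.read_text(errors="ignore").splitlines())
    scores[f.stem] = len({urlparse(u).netloc for u in urls if u})

# Highest-yield footprints first; cull the ones that barely return anything.
for footprint, unique_domains in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{unique_domains:>8}  {footprint}")
```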
@Pratik, with respect, you are concentrating on the wrong metric. You have only 306 verified links at the end of the day. Even if you get 300 LPM, with a low verified ratio your sites will not rank. You need a higher verified-to-submitted ratio.
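For what it's worth, the arithmetic behind that point as a tiny sketch; the submitted count is a made-up example, only the 306 verified figure comes from the post:

```python
# Illustrative only: the verified-to-submitted ratio @sonic81 is talking about.
submitted = 10_000   # hypothetical daily submitted count
verified = 306       # the verified figure quoted above

print(f"verified ratio: {verified / submitted:.1%}")  # -> 3.1%, far too low
```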
@sonic81 You're correct. I will set aside some time and sort out the submitted vs. verified lists very soon. That should give a higher verified rate.
But please know that when I manually verify at the end of the day, I always get about 3,500+ verified. I don't set automatic verification on my kitchen sinks (I verify them only every 4-5 days), so that decreases the apparent verified rate.
Try turning on automatic verification. It will slow down your LPM a touch temporarily, but you will get more accurate stats. I bet you will find you still get very few verified links.
The threads are also low now that "post to submitted sites" is not ticked. Hmm, I'm now understanding how @spunko2010 ended up with such low threads, lol.
Kinda can't resist: let me enable "post to submitted sites" too, and I'll check what the submitted vs. verified ratio looks like tomorrow.
@Pratik, first make sure you keep track of your original engines vs. the ones you just killed as a result of your evaluation. Put it in notepad or your spreadsheet.
The one comment I would make is that when I evaluated verified vs. submitted, I had two criteria: 1) drop every engine under a 10% verified rate, except where 2) the absolute number of verified links was large. So I used some qualitative judgment; I didn't want to throw away large numbers of links just because an engine had a poor verified percentage.
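A minimal sketch of that two-part filter, assuming per-engine submitted/verified counts exported from SER's stats; the engine names, counts, and the absolute cutoff of 800 are all my own illustration, since ron doesn't give an exact number:

```python
# Prune engines: drop anything under a 10% verified rate, UNLESS the absolute
# verified count is large enough to be worth keeping. All numbers illustrative.
engines = {
    "Article-WordPress": {"submitted": 20_000, "verified": 900},    # 4.5%, but big absolute count
    "Social-FooNetwork": {"submitted": 5_000,  "verified": 150},    # 3.0% and small -> drop
    "Wiki-MediaWiki":    {"submitted": 8_000,  "verified": 1_600},  # 20.0% -> keep
}

MIN_RATE = 0.10      # criterion 1: 10% verified floor
MIN_ABSOLUTE = 800   # criterion 2: hypothetical "large absolute number" cutoff

keep = [name for name, stats in engines.items()
        if stats["verified"] / stats["submitted"] >= MIN_RATE
        or stats["verified"] >= MIN_ABSOLUTE]
print(keep)  # -> ['Article-WordPress', 'Wiki-MediaWiki']
```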
As far as the sitelists, you already know I don't use them. I have to be honest here: over the last year since this forum started, I can't think of one issue that has caused more 'I have a problem' threads than those having to do with sitelists. Which is why I was so determined to see how fast SER could get without any assistance from sitelists or imported scraped URLs. Not saying these things are evil or anything. Some guys like @doubleup and others just break the LPM meter with scraped URLs. But sitelists are a whole different animal.
I think what some of the guys do is run big scraped lists into a lower tier, let them process, and then it just grows their verified sitelists. I don't know if @Lee still does that, but he was messing around with that a number of months back.
One of the things I want to point out is that @Lee probably has the LARGEST site list of any SER user. He was there at the beginning, and he has trounced people with his LPM. So if he has been doing that for a year, imagine how large his sitelist is.
My point @Pratik is that you are at the beginning. Your sitelist has barely begun. It's not going to make your LPM spinmeter break the glass if you know what I mean.
That's why what I advocate (at least for the new guys) is to run SER without the sitelist. Try to learn the software. See what settings make your LPM climb without sitelists. Get better with your engine selections, etc. Then you are in a strong position to make it go even faster.
If you can do that successfully, then you can just imagine what will happen when your sitelist is much bigger - and you turn that on. Then you'll be chasing @Lee.
Also, yeah, I applied that same rule: even where an engine's verified % is only 4-5% but it has a huge number of verified URLs, I left it checked as well, rather than going by % alone.
With no global site lists or scraped lists:
Contextual Only - about 40 LPM
Contextual + Kitchen Sinks (everything) - about 140 LPM
Kitchen Sinks Only - about 200 LPM
I don't use scraped lists. I always get better results with submitted and verified from the global lists.
Submitted is 250k+ daily and verified is about 50k+ daily (roughly a 20% verified rate).
^^And that's coming from the LPM Master himself.
I tried running big scraped lists into a lower tier several times, and each time it slowed submission down too much for my liking.
Set up a scrape, wait 24 hours, add a million or two links, and then get poor submission results.
Then I found another use for GScraper, which was editing the footprints.
Then I binned that idea because of the constant updates to the engine files.
Too much like hard work keeping the engine files up to date.
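For anyone wanting to try the footprint-editing route Lee describes, a rough sketch of pulling the footprints out of SER's engine files. The install path and the "search term=" key reflect how the engine .ini files looked at the time and may differ in your version, so treat both as assumptions:

```python
# Hedged sketch: extract footprints from SER's engine .ini files for editing
# or for feeding into GScraper. Path and "search term=" key are assumptions.
from pathlib import Path

ENGINE_DIR = Path(r"C:\Program Files (x86)\GSA Search Engine Ranker\Engines")  # assumed default

footprints = set()
for ini in ENGINE_DIR.glob("*.ini"):
    for line in ini.read_text(errors="ignore").splitlines():
        if line.lower().startswith("search term="):
            # several footprints can share one line, pipe-separated
            footprints.update(p.strip() for p in line.split("=", 1)[1].split("|") if p.strip())

Path("footprints.txt").write_text("\n".join(sorted(footprints)))
print(f"extracted {len(footprints)} unique footprints")
```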
I get good speeds even using Mega OCR; you just need to know how to set a service like that up to integrate into SER for best performance.
I'm still testing and evaluating ideas and getting some good results as I build on them.
But I pull results like this 7 days a week
As for global site lists, I can't see why you guys resist using them.
Ron said they cause a lot of problems, but very few, if any, are actually caused by them.
The only thing you have to remember with global site lists is to delete the duplicates on a regular basis.
I killed about 4 million today
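If memory serves, SER ships its own duplicate-removal tools in the advanced options, which is presumably how Lee kills duplicates; for lists kept outside SER, the same cleanup is a few lines of Python (the folder path is a placeholder):

```python
# Sketch: dedupe site-list .txt files (one URL per line) outside of SER.
# LIST_DIR is a placeholder; point it at wherever your exported lists live.
from pathlib import Path

LIST_DIR = Path(r"C:\SER\site_lists")  # placeholder

for txt in LIST_DIR.glob("*.txt"):
    lines = [u.strip() for u in txt.read_text(errors="ignore").splitlines()]
    unique = list(dict.fromkeys(u for u in lines if u))  # dedupe, keep first-seen order
    txt.write_text("\n".join(unique))
    print(f"{txt.name}: {len(lines)} -> {len(unique)} URLs")
```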
The scheduler was the main part of SER that caused problems, and that has been bug-free for months.
I mean, how did you determine that this footprint is good and that one is bad?