
LPM: Running No Global Lists


Comments

  • edited July 2013
    [screenshot]

    As you can see, I'm really crushing my hardware here (this is my home-PC setup). The submitted/verified ratio is around 25%, but my index tiers only get verified every 5,000 minutes...

    I'm fighting the proxy problem by feeding SER with Scrapebox lists. It's an extra 10 minutes of work daily, but I think it's a good workaround (and better than buying xx proxies per month). The Scrapebox pages are mostly fresh and virgin and therefore ripe to get spammed... (a rough sketch of that daily list-prep step is included at the end of this comment)

    Another thing: why are some people not using the global site list? I think it's a great feature, but I'd really like to know why you are avoiding it.

    PS: I thought about only verifying the index/kitchen-sink tiers every 5-7 days... Maybe that's not smart, because SER will NOT find a lot of the backlinks (they are there, they've just rolled off the page), so they don't get sent to Lindexed etc.
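
    By way of illustration only: the daily list-prep step mentioned above typically boils down to deduplicating the scraped URLs, usually per domain, before importing them into SER. The sketch below is not the poster's actual process; the file names and the keep-one-URL-per-domain policy are assumptions.

        # Minimal sketch: dedupe a scraped URL list by domain before importing into SER.
        # "scrapebox_export.txt" and the keep-one-URL-per-domain policy are assumptions.
        from urllib.parse import urlparse

        seen_domains = set()
        unique_urls = []

        with open("scrapebox_export.txt", encoding="utf-8", errors="ignore") as f:
            for line in f:
                url = line.strip()
                if not url:
                    continue
                domain = urlparse(url).netloc.lower()
                # Keep only the first URL seen for each domain.
                if domain and domain not in seen_domains:
                    seen_domains.add(domain)
                    unique_urls.append(url)

        with open("scrapebox_deduped.txt", "w", encoding="utf-8") as f:
            f.write("\n".join(unique_urls) + "\n")

        print(f"{len(unique_urls)} unique domains kept")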

  • My LPM is crap if I run only contextual projects without the global list.

    @ron, what LPM do you get running only contextual projects without a list?
  • I can't get above 10 LPM if I turn off global lists.
  • The same happens to me if I don't use the kitchen sink, of course.

  • ronron SERLists.com

    With no global site lists or scraped lists:

    Contextual Only - about 40 LPM

    Contextual + Kitchen Sinks (everything) - about 140 LPM

    Kitchen Sinks Only - about 200 LPM

  • Yes, what I'm trying to achieve is the 40 LPM you get using only contextuals; with the kitchen sink I can't pass the 100 mark.

    Hmm, any special settings you use to achieve 40 LPM with contextuals?... I don't know what else to do; maybe I need to try adding and removing footprints?
  • ronron SERLists.com
    You probably need to revisit the whole verified vs. submitted data and focus on the contextual engines. I'm sure that will help. I changed the footprints a long, long time ago and have just kept what I figured out back then. If you do the footprint thing, hopefully you have gscraper, because that is a huge help in the evaluation process. I remember doing the contextuals a while back and only getting some paltry number like 5 LPM. I can't remember what I changed or when, as it was a while back. It was probably both of those things.
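
    Assuming you have copied the per-engine submitted and verified counts out of SER into a small CSV, a few lines of Python will rank the engines by verified ratio, which is the evaluation being described here. The file name and column layout below are assumptions for illustration, not anything SER produces by itself.

        # Sketch: rank engines by verified/submitted ratio from a hand-made CSV.
        # Columns assumed: engine,submitted,verified (e.g. "Article-WordPress,1200,95").
        import csv

        def ratio(row):
            submitted = int(row["submitted"])
            return int(row["verified"]) / submitted if submitted else 0.0

        with open("engine_stats.csv", newline="", encoding="utf-8") as f:
            rows = list(csv.DictReader(f))

        for row in sorted(rows, key=ratio, reverse=True):
            print(f"{row['engine']:<30} {row['submitted']:>8} {row['verified']:>8} {ratio(row) * 100:6.1f}%")
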
  • And lol, having removed duplicate URLs, I'm down to 27-28 LPM.
  • Now I've increased to around 43 LPM, but I also raised the PR filter to at least 1 on Tier 1 today, so that's probably why. Still satisfied.
  • edited July 2013
    Update 2: I was actually able to push to 51+ LPM (automatically) from the 42 posted previously, despite clearing out duplicate URLs from all saved URLs:

    [screenshot]

    So considering that, I think it's still pretty cool. And the LPM is only increasing; right now, after taking the screenshot above, I'm on 51.92. :)

    Cheers.
  • edited July 2013
    Thanks @ron, I'll try to focus on footprints and see what I can get... By the way, I meant I CAN pass 100 LPM using the kitchen sink.

    Hey @Pratik, if you are not using a list, why would removing duplicate domains or URLs from the list affect your LPM?
  • @Pratik, with respect, you are concentrating on the wrong metric. You have only 306 verified links at the end of the day. Even if you get 300 LPM, with a low verified ratio your sites will not rank. You need a higher verified-to-submitted ratio.
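
    To put that in perspective with the round numbers quoted in this thread (a rough back-of-the-envelope calculation, and it assumes the 306 verified and the ~50 LPM quoted earlier refer to the same day's run): 50 LPM sustained for 24 hours is about 50 × 1,440 ≈ 72,000 submissions, so 306 verified links works out to well under 1% verified-to-submitted.
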
  • @rodol I indeed use them.

    @sonic81 You're correct. I will set aside some time and sort out the submitted vs. verified lists very soon. That should give a higher verified rate.

    But please know that when I manually verify at the end of the day, I always get about 3,500+ verified. I don't set automatic verification on my kitchen sinks (I verify them about every 4-5 days), so that lowers the verified rate.

    Cheers.
  • ronron SERLists.com
    @rodol If anything, it reduces the SER file sizes and RAM usage, so maybe a bit of efficiency there. I don't think it helps with LPM.
  • Try turning on automatic verification. It will slow down your LPM a touch temporarily, but you will get more accurate stats. I bet you will find you still get very few verified.
  • edited July 2013
    Alright, I got shot down today.

    I took the time today to analyse the submitted vs. verified ratios and made the appropriate changes. I trashed the engines with a sub:ver ratio under 7%.

    Next, I also changed my projects so they don't post to submitted sites, just verified sites.

    And my LPM is shit today, lol, 10.45.

    Any ideas @ron?

    The thread count is also low now that 'post to submitted sites' is unticked. Hmm. I now understand how @spunko2010 ended up in that low-threads situation, lol.

    I kinda can't resist; let me enable posting to submitted sites again, and I'll check the submitted vs. verified ratio tomorrow.

    Cheers.
  • ronron SERLists.com
    edited July 2013

    @Pratik, first make sure you keep track of your original engines vs. the ones you just killed as a result of your evaluation. Put it in notepad or your spreadsheet.

    The one comment I would make is that when I evaluated verified vs. submitted, I had two criteria: 1) drop everything under 10% verified, except for cases where 2) the number of verified links was a large absolute number. So I used some qualitative judgment - I didn't want to throw away large amounts of links just because an engine had a crappy verified % rate. (See the sketch after this comment.)

    As far as the sitelists, you already know I don't use them. I have to be honest here - over the last year since this forum started, I can't think of one issue that has caused more 'I have a problem' threads than those having to do with sitelists. Which is why I was so determined to see how fast SER could get without any assistance from sitelists or imported scraped urls. Not saying these things are evil or anything. Some guys like @doubleup and others just break the LPM meter with scraped urls. But sitelists are a whole different animal.
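
    Ron's two criteria can be read as a simple keep/drop rule: drop an engine whose verified rate is under 10% unless it still produces a large absolute number of verified links. The sketch below just encodes that rule against the same hypothetical CSV as earlier; the 'large' cutoff of 500 is an illustrative assumption, since he deliberately left that part to judgment.

        # Sketch of the two-criteria filter: drop engines under 10% verified,
        # unless they still deliver a large absolute number of verified links.
        # The CSV format and the LARGE_VERIFIED threshold (500) are assumptions.
        import csv

        MIN_RATIO = 0.10       # from the post: "drop everything under 10% verified"
        LARGE_VERIFIED = 500   # illustrative only; "large" was left to judgment

        keep, drop = [], []
        with open("engine_stats.csv", newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                submitted = int(row["submitted"])
                verified = int(row["verified"])
                ratio = verified / submitted if submitted else 0.0
                if ratio >= MIN_RATIO or verified >= LARGE_VERIFIED:
                    keep.append(row["engine"])
                else:
                    drop.append(row["engine"])

        print("keep:", ", ".join(keep))
        print("drop:", ", ".join(drop))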

  • LeeGLeeG Eating your first bourne

    I don't use scraped lists. I always get better results with the submitted and verified global lists.

    Submitted daily is 250k+, with about 50k+ verified daily.

  • ronron SERLists.com
    edited July 2013

    ^^And that's coming from the LPM Master himself.

    I think what some of the guys do is run big scraped lists into a lower tier, let them process, and then it just grows their verified sitelists. I don't know if @Lee still does that, but he was messing around with that a number of months back.

  • LeeGLeeG Eating your first bourne
    edited July 2013

    I tried it several times and each time it slowed submission too much for my liking

    Set up a scrape, wait 24hrs, add a million or two links and then get poor submission results

    Then found another use for gscraper, which was editing the footprints

    Then I binned that idea because of the constant updates to the engine files

    Too much like hard work keeping the engine files up to date

    I get good speeds even when using mega ocr; you just need to know how to set a service like that up to integrate into SER for best performance

    I'm still testing and evaluating ideas and getting some good results as I build on them

     

    But I pull results like this 7 days a week

    [screenshot]

     

    As for global site lists, I can't see why you guys resist using them

    Ron said they cause a lot of problems, when there are very few caused, if any

    The only thing you have to remember with global site lists is to delete the duplicates on a regular basis (a sketch of that cleanup follows this comment)

    I killed about 4 million today :D

    The scheduler was the main part of SER that caused problems, and that's been bug-free for months
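
    Outside the tool itself, the duplicate cleanup described above amounts to stripping repeated lines from the site list files. A minimal sketch, assuming the list is a folder of plain-text files; the folder name "sitelist_verified" is a placeholder, not SER's real layout.

        # Sketch: remove duplicate URLs from each plain-text site list file in a folder.
        # "sitelist_verified" is a placeholder path; SER's real folder layout may differ.
        from pathlib import Path

        folder = Path("sitelist_verified")
        removed_total = 0

        for path in folder.glob("*.txt"):
            lines = [line.strip() for line in
                     path.read_text(encoding="utf-8", errors="ignore").splitlines()
                     if line.strip()]
            # dict.fromkeys keeps the first occurrence of each URL and preserves order.
            unique = list(dict.fromkeys(lines))
            removed_total += len(lines) - len(unique)
            path.write_text("\n".join(unique) + "\n", encoding="utf-8")

        print(f"Removed {removed_total} duplicate lines from {folder}")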

  • ronron SERLists.com
    edited July 2013

    One of the things I want to point out is that @Lee probably has the LARGEST site list of any SER user. He was there at the beginning and has trounced people with his LPM. So if he has been doing that for a year, imagine how large his sitelist is.

    My point, @Pratik, is that you are at the beginning. Your sitelist has barely begun. It's not going to make your LPM spinmeter break the glass, if you know what I mean.

    That's why what I advocate (at least for the new guys) is running SER without the sitelist. Try to learn the software. See what settings make your LPM high - without sitelists. Get better with your engine selections, etc. Then you are in a strong position to make it go even faster.

    If you can do that successfully, then you can just imagine what will happen when your sitelist is much bigger - and you turn it on. Then you'll be chasing @Lee.

  • @ron Absolutely wonderful post(s). Yes, I definitely understand your point.

    @LeeG thanks for your opinions too. :)

    I will look things over and optimize again where possible.
  • Also, yeah, I did apply that criterion: even where the verified % is only 4-5% but the engine has a huge number of verified URLs, I left it checked as well, not just going by %.

    Cheers.
  • edited July 2013
    @LeeG, could you please share with us how you use Gscraper for evaluating footprints?
    I mean, how do you determine which footprints are good and which are bad?