
LPM: Running No Global Lists

Hi guys,

What's everyone's LPM lately when running no global lists? I've noticed a sharp deterioration in SER performance recently. Here's what I've done:

1) I'm running 50 private proxies and finding they get banned pretty fast, so searching ends up over pretty quickly.
2) Even when carefully rotating my Google engines and buying new proxies to minimize bans, I find that GSA isn't scraping my targets, and I'm getting only about 100 submits an hour for my tier 2 campaign, which runs an optimized identified/verified engine list.
3) I've loaded up GScraper to scrape a 200k URL list using the footprints from the engines, and I end up with only around 200 submits. Very poor hit rate.

Has anyone noticed similar things?

Comments

  • I've seen those messages sprouting up a lot lately. Honestly, I don't care; I just let that beast run.

    When a new day starts, I stop the projects and verify all the submitted links. That way, in 45 minutes or so, the proxies get a rest too and can possibly get unbanned, and the spare time is used to verify the submitted links and send them off for indexing.

    I noticed twice today that after I gave it a break, my LPM stayed above 100 for about 10 minutes! Then it settled around 38-40.
  • What's everyone's verified rate? I can't get mine above 5%.
  • I'm having the same problem.

    GSA SER is scraping VERY FEW new targets every day.

    The first couple of days of a campaign it'll get about 200-300 verified. Then afterwards it's an uphill battle with very little progress. The problem is ending up with only a couple hundred unique domains, which isn't good if you're pointing them at the money site.

    I'm also using a 400k keyword list, each keyword 1-2 words, and I've optimized my engines, keeping the high-verified ones.
  • ronron SERLists.com

    I don't use any sitelists or imported targets:

    image

    Right now it's just past noon, so this is a half day. I don't verify anything other than T1; all other tiers and junk tiers are disabled for verification, and I verify those once a week. That's why you see a low verified count.

    I typically average 15% - 20% in total across all tiers. I know that people like @Lee have invested a ton of time in figuring out platforms, engines, settings, etc., and he has posted proof on this board that he is hitting a crazy verified percentage, like 35%. So it requires analysis, tinkering, logic, and persistence. Remember, some platforms yield a higher % verified than others, so all of this factors into the mix.

  • Also keep in mind that your VPS or server makes a big difference. On one of my Xrumer dedi servers that costs $447/mo, I got about 250-300 LPM on trackbacks with a raw list the other day; on a typical $100-150/mo VPS, I might have gotten 50-100 max.
  • My LPM is terrible at the moment; it decreased instead of increasing after I added more projects. I'm running 21 projects at 21-22 LPM, so low. Even before adding those 3-4 projects, it had been decreasing over the past 3-4 days. 50 is a dream now. Not sure what's wrong; are my proxies dying? I don't think so, since I usually give them a 2-hour rest daily.
  • use the scheduler to see if it makes a difference.
  • @ron Do you still use the scheduler with 10 projects together?
  • ronron SERLists.com

    Yeah, I use 10 projects every 20 minutes.

    @Pratik, I think the real issue is the number of projects. I know that when I have fewer projects, my LPM goes down. I have about 110 projects, so there is a lot of food for SER.

  • @ron Ahaha, I thought lower for you was like 5-6 projects, but boy was I wrong. I thought having 27-30 projects was considered "high", *sigh* (on me). :(

    Btw, my VPS's CPU maxes out at around 97% quite a few times, so with 60-70 projects, can SER and my VPS handle it? I always thought that was overkill and that I might need multiple SER licenses.

    My specs, btw: 2 GB DDR3 RAM, 70 GB HDD, 3.3 GHz.

    What are yours?

    Thank you.
  • ronron SERLists.com

    I have 6 GB RAM, 60 GB HDD, and a 2.0 GHz E5 Xeon.

    With the scheduler set as mentioned above, I average only 50% CPU.

    Task manager has my average memory for SER at 400 MB.

    Task Manager has my Total Memory at 1.2 GB, so I obviously have a lot of room.

    In the beginning I *used to* run all projects at once, but as I added projects that was no longer feasible. Hopefully you are using the scheduler; if not, please do. It requires just a fraction of the CPU and memory compared to running everything simultaneously.

  • edited July 2013
    @ron

    I tried the scheduler a couple of days ago, but I thought it wasn't for me.

    Now that my projects are increasing, I think I might give this a go today.

    How was your LPM when you used to run all projects at once? Was it higher than it is now?

    Thank you.

    edit: Also, I assume those specs are for your VPS and not your own PC?
  • ronron SERLists.com

    Yeah VPS. My LPM is consistently higher with scheduler.

    I used to feel the same way about the scheduler. The problem I started running into, as my projects got bigger, was that I had to split my runs so I wouldn't overwhelm my RAM and CPU. It just became stupid, and I was babysitting all the time.

    I think the reason I shied away from it (9 months ago) was that it was new and some people were crashing all the time (while I wasn't crashing). So of course I stayed away from it. But that was eons ago. Now I would be lost without it.

  • edited July 2013
    @ron

    You were damn right, man! I've been running the scheduler for the past hour (the same as yours, 10 projects for 20 minutes) and I'm now seeing the result:

    image

    A whopping 59-60 LPM! That's my highest sustained LPM on record. I now think anyone with more than 15 active projects should definitely give the scheduler a try; you are likely to see a vast increase in LPM.

    I was doing 20-22 in recent days as my projects increased, and now I'm at 58-60.

    I'll see if it stays at 50+ all day long with the scheduler, as it may be too early to judge.

    Although it's dropping slowly (now around 57.90), it might be a temporary slowdown; hoping for the best though!

    edit 2: Now increasing again, up to 58.80! Nice.

    Thank you. :)
  • AlexRAlexR Cape Town
    @ron - do you reverify your T0 contextual links or do you also only do this once per week?
  • 3.4% verified is pretty low, no? If you can get your verified rate to 20%, that's much better; it's equivalent to taking your LPM from 20 to 120.
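A quick back-of-the-envelope check of the equivalence claimed above (a sketch; only the 3.4%, 20%, and 20 LPM figures come from the thread, the helper function is illustrative):

```python
# Verified output per minute is LPM times the verified rate, so a
# higher verified rate is interchangeable with a higher LPM at a
# fixed rate.

def verified_per_minute(lpm: float, verified_rate: float) -> float:
    """Verified links produced per minute of running."""
    return lpm * verified_rate

# 20 LPM at a 20% verified rate...
improved = verified_per_minute(20, 0.20)   # 4.0 verified per minute

# ...produces the same verified output as this many LPM at 3.4%:
equivalent_lpm = improved / 0.034
print(round(equivalent_lpm, 1))            # -> 117.6
```

So the "20 to 120" figure in the post is roughly right; the exact equivalent is about 118 LPM.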
  • ronron SERLists.com

    @AlexR - I only verify T1 daily as they go straight to the money sites. Everything else I have at once a week just because I want SER posting more and verifying less.

    @Pratik - You will find with SER that things get faster over time. You'll have more projects, you'll get better at picking which engines to use, and like everyone else you will stumble onto settings and combinations that make SER go faster. Keep in mind that my LPM is never stable; one day it's one number, the next day it's different. There are many factors, like connection speed, which sites get the day's posts, and website server issues, that can affect LPM.

  • edited July 2013
    @ron and everyone, this is my result for today's entire day taken at 11:59 PM:

    image

    This is my highest record: 73.1K submitted today! :D

    My last personal best was around 52K, and this time I maintained 50+ LPM all day long. This is all down to switching to the scheduler; before, I was hanging around 21-22 LPM.

    And I'd have been at around 75K or 76K, but I give my proxies a good 2-hour rest, so that was the result of 22 hours, to be specific.

    Thanks for the great advice. :)
  • ronron SERLists.com

    Screw resting the proxies. You take a nap if you need to, but your proxies don't need a break. Let them rip 24/7. You'll be fine.

  • AlexRAlexR Cape Town
    @ron - thanks for the response. I wanted to ask about your T1 contextual links. I have a domain PR filter of 2, and I just can't seem to hit my low daily targets (40 contextual submissions per project per day), while all my other project types are hitting their targets (so proxies and various settings must be fine). I can either lower the PR or start scraping and importing. Have you had this issue, and how do you get around it?
  • @ron

    Actually, I don't just sit idle during those 2 hours; I set the projects to Active Verification [Active(V)], so during that time SER verifies all submitted (pending) links.

    That way the proxies get a good 2-hour rest from scraping search engines, and in that time my other task of verifying submitted links gets done, so I'm pretty satisfied. :D
  • edited July 2013
    @pratik congrats on getting your submit rate up. But with 1,108 verified out of 73,136 submitted, that's a verified rate of 1.5%, so at the end of the day you only have 1,108 links after running all day. And at the end of the day, we are all trying to maximize verified links, not submitted links.

    I think you are running global lists and experiencing the problem I detailed here: https://forum.gsa-online.de/discussion/5046/i-think-i-found-the-cause-for-low-verified#latest

    image


    After fixing the problem manually, this is what I get: about a 17% verified rate, halfway through running 13k submitted. Still puny compared to @leeg and @ron :)
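The rates being compared in this post come down to simple arithmetic; here is a sketch (the 1,108 / 73,136 counts and the ~17% figure come from the posts above; the function name is mine):

```python
# Verified rate is just verified / submitted, expressed as a percentage.

def verified_rate(verified: int, submitted: int) -> float:
    """Verified links as a percentage of submitted links."""
    return 100.0 * verified / submitted

rate = verified_rate(1108, 73136)
print(f"{rate:.1f}%")        # -> 1.5%

# At the ~17% rate reported after the fix, the same day's submissions
# would have yielded on the order of:
print(int(73136 * 0.17))     # -> 12433 verified
```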


  • edited July 2013
    @sonic81 Definitely, but I manually verify all submitted (pending) links at the start of the new day, so the proxies get a break from scraping, as I mentioned above. On verifying I got more than 6K verified, but 7-8K in total is still low.

    How did you do it if you don't mind sharing it?

    Thank you.

    I'd also mention that I don't verify kitchen sinks, just the main tiers, in case you're verifying those too.

    Cheers.
  • edited July 2013
    Btw,

    Today is better than yesterday! It's been some 4 hours since I started today's campaigns, and here's the result:

    image

    :D
  • @pratik, is your LPM constant throughout the day? Also, do you schedule your projects to switch every 30 minutes?
  • At some point within 24 hours you will have to verify, right? Have you got stats where you actually see the verifies shoot up? My data shows that running it that way, you will always have only 292 verified links; it never goes up, and the other 17,918 submits never verify.

    This is because running it like this means resubmitting to the same sites over and over again. A site won't allow more than a few links, so you end up wasting SER resources submitting to sites that don't verify.

    I clean my target URL history and update the URLs to get my verifies. It's not the cleanest method, but it works for me; it's the only workaround I have found.

    My theory is that in tier 2 and lower, clearing the target URL history is fine: with a 10:1 ratio, mathematically you won't be submitting to the same site pointing back to the same tier 1 link.
  • @stranger I'd say it's quite close. Yesterday I originally started around 58-59, but I maintained at least around 52 the whole day.

    Now, for today, it's been about 3 hours since I posted that last screenshot and my LPM is hovering around 63-64, so yeah, it's pretty stable!

    @sonic81 You clear target URL history daily? I verify ALL submitted links (from the previous day) at the start of the next day, which does two things:

    1. It lets me verify those submitted links and makes things a little clearer.
    2. It also gives the proxies a good 2-hour rest from scraping.

    So you delete the target URL cache every day? And for all projects? Including kitchen sinks too?

    Cheers.
  • @pratik only for kitchen sinks.

    If you go to Advanced -> Tools -> Delete Duplicate URLs, you will see a lot of duplicates. I bet you are resubmitting to the same sites; the global list goes haywire with all the duplicates.
  • @sonic81

    Good point. At the end of today, when I'm about to verify the links submitted throughout the day, I'll give that a check!

    Cheers.
  • ronron SERLists.com
    @AlexR - I set a domain PR of 1 on my T1s and have been hitting the targets. You might want to run the T1s separately at some point each day, give them their own slot on the scheduler, and see if that helps.