
GSA works fine, but eventually gets very slow and never recovers

135 Comments

  • LeeG Eating your first bourne

    I went through my engines again last night.

    Most of the engines I'm using were fine, two were short and one needed pulling because it's not supported by CB.

    Not bad for not using the nuclear-science zombie-killing degree method (The Walking Dead is back on TV, hence my fascination with zombies).

    I compared submission speeds between 5.17 and 5.03.

    5.03 rocks. I was running at over 200 LpM earlier with that version.

    5.17 is lucky to hit 120 LpM.

     

  • ron SERLists.com

    I have to check that out against my current list.

  • @Ron - why would we have to change anything when it was working just fine until the recent updates?
  • ron SERLists.com

    What LeeG is talking about is essentially using a second method to identify which engines you could remove from your projects in SER.

    He is simply looking at what CB was not supporting, and pulling those engines out of his projects, because if they are not supported, you won't get links from there.

  • OK, but what changed from the last version that worked to this version that has a low LpM?
  • LeeG Eating your first bourne

    I have found the same with poor LpM

    Look at the difference in speed I noted between two versions

    I'm lucky, I back up the exes from time to time, so I always have a fallback.

    CPU usage is not consistent with the meter.

    In the older version, I max out my VPS CPU.

    In the new version, if you monitor CPU with the resource monitor, it's up and down.

  • I am still wondering... do I check verified against submitted or identified? What makes more sense?
  • ron SERLists.com

    Use verified divided by identified. I should have mentioned to skip submitted, as that's what ends up getting verified, and that number goes down as SER either finds the link or can't. Either way, that column doesn't help you out.
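    To make that arithmetic concrete, here is a minimal sketch of the verified/identified check. The engine names, counts and the 5% cut-off are made-up examples, not values taken from SER.

```python
# Hedged sketch: compute verified/identified per engine and flag the weak ones.
# The engine names, counts and the 5% cut-off below are illustrative only.
stats = {
    "General Blogs":    {"identified": 12000, "verified": 240},
    "Wiki - MediaWiki": {"identified": 3000,  "verified": 450},
    "Article - XYZ":    {"identified": 8000,  "verified": 40},
}

CUTOFF = 0.05  # keep engines that verify at least 5% of what they identify

for engine, c in sorted(stats.items()):
    ratio = c["verified"] / c["identified"] if c["identified"] else 0.0
    verdict = "keep" if ratio >= CUTOFF else "consider removing"
    print(f"{engine:<18} {ratio:7.2%} -> {verdict}")
```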

  • LeeG Eating your first bourne

    Or, if you use CB, just use the engines CB does captchas for and you won't be far off.

    Fewer technical skills with girly-type word-processing stuff needed :D

  • Yes, makes sense :-( I > S > V! So best is V/I... thank you, will redo the math.
  • LeeG Eating your first bourne

    Two calculations:

    girly pansy methods + time = less TV and beer-drinking time

    time + effort + brain + spending time working on the bits that matter = high LpM and verified

    Plus the next SER release will alter all the calculations anyway with the search method Ozz has had added.

  • I see people saying their LpM is over 100... how are you achieving this? Mine has been stuck under 1 for a couple of days now. Some of my active projects in there are new with barely any links, so I know they haven't run out of keywords. Right now my LpM is 0.70 with 730 submissions for the day and 33 verifications.
  • Never mind, I think mine was so slow because I was using filters such as fewer than 50 outbound links, etc.

    When I added some new campaigns and didn't put any sort of filters on them, GSA started flying and my LpM went way up.
  • Just curious, I posted something on another thread and am reposting here. I've narrowed a lot of my issues down to dead proxies from BuyProxies for certain Googles. Is everyone on this thread experiencing issues using BuyProxies semi-dedicated?

    I've been having major problems with my BuyProxies proxies in the last 2 weeks; I think that's been the source of my problems. There is no easy way to tell if a proxy has been banned except for searching my logs and seeing 000/000 constantly appearing.

    I was running at 60 LpM this morning and this afternoon went down to 4 LpM. Lots of proxy bans. I'm running 40 shared proxies from BuyProxies. Just curious, for everyone who has had low LpM issues recently, were they all using BuyProxies?
  • edited February 2013
    Are you using a high-PR filter?

    I was doing some tests with the scheduler. I have some high-PR projects with PR2/PR3 filters, and I use proxies for everything. Using the scheduler, my high-PR projects were all active at the same time; 3 of my proxies got banned from Google, but now they are alive again.

    I am using BuyProxies.
  • ron SERLists.com

    @sonic81 - 9.9 times out of 10 the issues really aren't the proxies. It's usually a setting somewhere in SER that is causing a drag.

    @rodol provided an excellent example of how proxies get blamed for the problem when in actuality the OBL filter is the root cause of the aggravation.

  • AlexR Cape Town
    edited February 2013
    I would say there are 3 issues that cause this:
    1) Proxies that go dead. (Hence it would be nice for SER to check proxies automatically every so often and disable the bad ones. On the next check they may have recovered and can be re-enabled. So if 25% of your proxies are temporarily banned, LpM will drop by 25% for that period. Why not have it auto-remove these and add them back when they are no longer banned?)
    2) Keywords and SE results overlap. Having 100k keywords is great, but if they just generate the same SE results you're wasting time parsing and getting "already parsed". I.e., how different are the results for "blue widget" and "Blue widget 2013"? There is still no workaround or tool for us to remove keywords that overlap too much. (@LeeG - how are you getting around this? See the sketch after this list.)
    3) It's running out of targets. It would be nice to have an option for it to also use "related searches" to expand the number of results generated from a set of keywords.
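    For point 2, a rough way to thin out overlapping keywords before feeding them to SER could look like the sketch below. The normalisation rules (case-folding, dropping years, ignoring word order) are assumptions about what makes two keywords return near-identical search results, not anything SER itself does.

```python
import re

# Hedged sketch: collapse keywords that only differ by case, word order or a
# trailing year, on the assumption that such variants return near-identical
# search results. The normalisation rules are illustrative only.
def normalise(keyword: str) -> str:
    kw = keyword.lower()
    kw = re.sub(r"\b(19|20)\d{2}\b", "", kw)          # strip years like "2013"
    words = sorted(w for w in re.split(r"\W+", kw) if w)
    return " ".join(words)

def thin_keywords(keywords):
    seen, kept = set(), []
    for kw in keywords:
        key = normalise(kw)
        if key not in seen:
            seen.add(key)
            kept.append(kw)
    return kept

print(thin_keywords(["blue widget", "Blue widget 2013", "widget blue", "red widget"]))
# -> ['blue widget', 'red widget']
```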
  • You guys keep pointing out proxies, but how can it be the proxies if the problem goes away as soon as I clear the URL history? If the proxies were dead they would still be dead after that, or am I missing something here?

    About keywords or running out of targets: if I am using the global lists and I have unchecked "avoid posting to same domain", shouldn't the application endlessly blast the list repeatedly, or does it do only one pass and then go hunting with the keywords provided?
  • LeeG Eating your first bourne

    You say clearing the URL history speeds it up?

    Sounds like you need to add a lot more keywords.

    What you're doing is deleting any history of where it's already been.

    So it visits the same places again and again after clearing that.

  • Ozz
    edited February 2013
    Clearing your URL history obviously increases your submission rate because you won't get "already parsed" messages in the beginning, and SER can use your global lists with URLs it has already identified or submitted to earlier. SER doesn't need to spend time searching for new target URLs because of this.

    After a while SER is done with (part of) the lists and tries to search for new target URLs. But when your proxies are banned from searching, you won't get new target URLs, which results in a low submission rate.

    If you just want to use your lists, then disable the SEs for a while and see how far you can go with that. Just make sure to delete duplicated URLs first to speed things up.

  • edited February 2013
    Yes, I've been saying that in every post I make in this thread (but as much as I love all the info in this thread, it is basically drifting from the OP). Again, I tried with 2k and 100k keywords; even with keywords overlapping, I'm pretty sure 100k should yield more results than 2k, or at least I should notice the slowdown occurring much later.

    Either way, no one seems to know the answer to my question about what happens when the global list is fully posted and I have unchecked "avoid posting to same domain". Any ideas on that one?

    Keep in mind one more thing: the slowdown is the threads going down (sub-30), but the CPU is always at 99% in the interface. If the threads and the CPU faded in a similar fashion it would seem a lot more normal to me.

    Going to try and run without SEs then, but the funny thing is that when the app is grinding to a halt I check the proxies with Google (google.com and look for the word "about") and they come out OK. That is what really puzzles me at the moment.
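    The manual check described above (fetch google.com through the proxy and look for the word "about") can be scripted roughly as below. The proxy addresses are placeholders, and as this post shows, a proxy can pass this test and still be blocked on the search results pages.

```python
import requests

# Hedged sketch of the manual check: fetch google.com through each proxy and
# look for the word "about" in the response. The proxy list is a placeholder;
# a ban on the search endpoint may not show up in this test at all.
PROXIES = ["1.2.3.4:8080", "5.6.7.8:8080"]  # placeholder addresses

for addr in PROXIES:
    proxy = {"http": f"http://{addr}", "https": f"http://{addr}"}
    try:
        r = requests.get("https://www.google.com/", proxies=proxy, timeout=15)
        ok = r.status_code == 200 and "about" in r.text.lower()
    except requests.RequestException:
        ok = False
    print(f"{addr}: {'looks alive' if ok else 'failed'}")
```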
  • LeeG Eating your first bourne

    What are you using to scrape keywords?

    If it's Scrapebox, you will scrape a lot of the same type of words.

    So your target finding will be extremely limited, even with 100k words.

    It will be the same results, time and time again.

  • Ozz
    edited February 2013
    I reread your OP again and you are saying that your proxies are OK. Are they OK for posting and for searching, or do you observe many "000/000" messages in your log when SER is searching?

    Also, you've never posted a log, which might help to figure out your issue, if I'm not wrong.
  • Yes, I am scraping with SB, so I'm bound to have lots of similar keywords. I'm looking into something like the Google wheel that would put out related but not similar results (like http://www.instagrok.com/ for example). I just analyzed an old log that I had and I get roughly 15 "000/000" messages every minute. Is that excessive, or an indicator that the proxies are not good for searching? What URL and words are you guys using to check if a proxy is banned for search (assuming everyone is using Google to search, that is)?

    Yes, I hadn't uploaded one yet, but I have now uploaded the zipped log (1h) here.

    Thanks for the help.
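    One way to put a number on those "000/000" messages is to count them in an exported log. This is only a sketch: the file name is a placeholder, and no particular log format is assumed beyond the marker string itself.

```python
# Hedged sketch: count how often "000/000" (no results from a search query)
# appears in a SER log export. The file name is a placeholder.
LOG_FILE = "ser_log.txt"  # placeholder path

total, empty = 0, 0
with open(LOG_FILE, encoding="utf-8", errors="ignore") as fh:
    for line in fh:
        total += 1
        if "000/000" in line:
            empty += 1

if total:
    print(f"{empty} of {total} log lines contain 000/000 ({empty / total:.1%})")
else:
    print("log file is empty")
```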
  • AlexR Cape Town
    I also find my threads drop drastically and stay down... it's on my to-do list to resolve... so I'm watching closely here!
  • The 000/000 message is an indicator that you are not scraping new URLs to post; maybe you are using a lot of repeated keywords.

    Something that has worked for me is to go to the Google Keyword Tool, search for your niche keyword, take the 800 keywords GKT gives you, and try with that.
  • LeeG Eating your first bourne

    I have been getting a lot of crashes myself recently

    About every 8 hrs

    My LpM has been running low at about 140 to 160, which is very low by my standards. I push 200 LpM+ most days.

    My crashes normally happen when it's pulling site lists and verifying links.

    This is what I did last night, a four-stage attack:

    Stage one

    Delete all global site lists. I only use submitted and verified

    Stage two

    Recreate global site lists via tools on the advanced menu

    Stage three

    Delete duplicate URLs from those lists (see the sketch below)

    Stage four

    Delete target url cache

    Touch wood, it's been running all night with a healthy LpM and submitted rate.
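    Stage three has a built-in option in SER for it, but done by hand on exported site-list text files it would look roughly like the sketch below. The folder pattern is a placeholder and the one-URL-per-line format is an assumption.

```python
import glob

# Hedged sketch of "stage three" done by hand: remove duplicate URLs from a
# folder of plain-text site-list files (assumed one URL per line). The path
# pattern is a placeholder; SER has its own tools for this on the advanced menu.
for path in glob.glob(r"C:\site_lists\sitelist_*.txt"):  # placeholder pattern
    with open(path, encoding="utf-8", errors="ignore") as fh:
        urls = [line.strip() for line in fh if line.strip()]
    unique = list(dict.fromkeys(urls))   # keeps first occurrence, drops repeats
    with open(path, "w", encoding="utf-8") as fh:
        fh.write("\n".join(unique) + "\n")
    print(f"{path}: {len(urls)} -> {len(unique)} URLs")
```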

  • Just a heads-up on this: I removed all search engines from all my projects, so there is no scraping going on whatsoever, but the symptoms remain.
  • LeeG Eating your first bourne

    So you're only using global site lists?

    No search results from the search engines?

  • Yes, I tried disabling all search engines in every running project, to check whether it is really related to the scraping part or not.