
GSA works fine, but eventually gets very slow and never recovers


Comments

  • edited February 2013
    As the issue seems to be keyword-related according to @Sven, would the people with no issues mind sharing how they generate keywords for their lower tiers, so I can find a temporary workaround?

    I've used a generic 100k keyword list and a generic 2k keyword list.

    I've also scraped a 2k keyword list with ScrapeBox, i.e. if my niche is blue widgets I scrape for blue widgets, then take the output and re-scrape it again, ending up with roughly 2k LSI keywords for blue widgets (a rough sketch of that expand-and-rescrape idea is at the end of this post).

    All of them seem to trigger the bug.

    @ron, how do you scrape?
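    For anyone without ScrapeBox, here is a minimal sketch of that expand-and-rescrape idea, assuming Google's public autocomplete (Suggest) endpoint; the endpoint, parameters and delay are my assumptions and have nothing to do with SER's internals:

    ```python
    # Minimal sketch of keyword expansion (not ScrapeBox, not part of SER).
    # Assumes Google's public autocomplete endpoint; adjust if it changes.
    import json
    import time
    import urllib.parse
    import urllib.request

    def suggestions(keyword):
        url = ("https://suggestqueries.google.com/complete/search?client=firefox&q="
               + urllib.parse.quote(keyword))
        with urllib.request.urlopen(url, timeout=10) as resp:
            data = json.loads(resp.read().decode("utf-8"))
        return data[1]  # second element holds the suggested phrases

    seed = "blue widgets"
    expanded = set(suggestions(seed))      # first pass
    for phrase in list(expanded):          # second pass: re-scrape the output
        expanded.update(suggestions(phrase))
        time.sleep(1)                      # be polite, avoid getting rate limited
    print(len(expanded), "keywords for the lower tiers")
    ```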
  • I am having the same problem.
    I am using the latest version (5.17), I've started 3 new projects today, and after a few minutes the thread count is 0, and there are no submissions at all...

    What should I do?
  • edited February 2013
    I still have the same problem: it runs for a few minutes and then stops. Don't worry, super @Sven is on it; I'm sure a fix will be rolled out soon.

    You can also delete your URL history; that fixes it temporarily for me.
  • edited February 2013
    If I understand GSA correctly, Sven said that keywords are mainly used when searching for blogs to comment on. So if you don't have "always use keywords to find target sites" ticked, GSA will only use footprints like "powered by wordpress" or "leave a comment" to find sites. Now, if you use Google in GSA, you will find at most 1,000 sites for each of these footprints. Yes, new sites are created every day for us to comment on, but the top 1,000 positions don't vary much from day to day. My guess is that, if you are lucky, you get 100 new sites out of every 1,000 for each footprint without keywords. I think this is one of the reasons why GSA eventually slows down. Of course, you can always use keywords to find new sites, which I think is better if you have the "right" keywords.

    Any comment is appreciated. I am having the same slowdown problem as well. (A rough sketch of the top-1,000 arithmetic follows below.)
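    A rough illustration of that arithmetic; the footprints, keywords and counts below are placeholders for the sake of the example, not SER's actual internals:

    ```python
    # Rough illustration of the "top 1,000 results" argument, not SER's logic.
    footprints = ['"powered by wordpress"', '"leave a comment"']   # placeholder footprints
    keywords = ["dog training", "blue widgets", "guitar lessons"]  # placeholder keywords
    GOOGLE_CAP = 1000  # Google only exposes roughly the first 1,000 results per query

    # Footprints alone: one query each, and those top 1,000 barely change day to day.
    print(len(footprints) * GOOGLE_CAP, "URLs max from footprints alone")   # 2000

    # Footprint + keyword: every combination is a separate query with its own cap.
    queries = [f"{fp} {kw}" for fp in footprints for kw in keywords]
    print(len(queries) * GOOGLE_CAP, "URLs max once keywords are added")    # 6000
    ```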
  • Not sure about this; I have changed that setting twice to test, and changed my keywords, and it's still affecting all my projects.

    Anyone have any luck fixing it?
  • It's also happening to me; threads go down to like 0 or 1, and my LpM right now is 0.08 :(
  • OzzOzz
    edited February 2013
    >So if you don't have "always use keywords to find target sites" ticked, GSA will only use footprints like "powered by wordpress" or "leave a comment" to find sites.

    That is not true; it just depends on the engine.

    Every engine with 'Use keyword for search=1' in its script, like "General Blogs", will use keywords for search even if "always use keywords for search" is unchecked.
    Other engines are scripted in a way that they never use keywords for search, where it makes absolutely no sense because you won't find many (or any) target URLs with the combination of footprint + keyword.
    And there are engines that add a keyword to the search from time to time, when it could make sense (a rough sketch of how I picture that per-engine logic is below).
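    A rough sketch of that per-engine decision, not SER's real code; the .ini layout, section name and file path are assumptions on my part, and only the setting names and values (0/1/2) come from the engine scripts discussed above:

    ```python
    # Sketch of the per-engine keyword decision described above, not SER's code.
    # The ini section name and file path are assumptions; the setting values
    # ("use keyword for search" = 0/1/2) are the ones from the discussion.
    import configparser
    import random

    def keyword_mode(engine_ini):
        cfg = configparser.ConfigParser(strict=False, interpolation=None)
        cfg.read(engine_ini, encoding="utf-8")
        if cfg.has_section("setup"):                     # assumed section name
            return int(cfg.get("setup", "use keyword for search", fallback="0"))
        return 0

    def build_query(footprint, keyword, mode, always_use_keywords=False):
        if always_use_keywords or mode == 1:             # e.g. "General Blogs"
            return f'{footprint} "{keyword}"'
        if mode == 2 and random.random() < 0.5:          # add a keyword "from time to time"
            return f'{footprint} "{keyword}"'
        return footprint                                 # this engine never adds keywords

    print(build_query('"powered by wordpress"', "dog training", mode=1))
    ```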
  • edited February 2013
    @Ozz yes, that is basically what I meant: for blogs GSA always uses keywords to search. For others, that is not the case if you don't have that option ticked.

    Actually, I think it makes more sense to use keywords when looking for all types of sites. The purpose is not to find relevant sites, but rather to find more sites. Think about it, taking the indexer engines as an example. If we only use the default internal footprints without keywords, each footprint can theoretically return at most 1,000 sites, because Google only returns the top 1,000 results even when there are millions of matches. But if we add keywords to that specific footprint (they can be anything), each footprint-plus-keyword combination is a separate query, so we should be able to get more sites returned overall.
     
  • OzzOzz
    edited February 2013
    No, you are wrong about this.

    just compare these two examples:

    And now think about how specific your keywords are for searching?! If most of your keywords are as specific as "german sheppard dog training", or belong to a niche that isn't that popular, then you will end up with no results at all.

    Your thinking might be true if
    a) your keyword is a broad term like "dog training" and
    b) your niche is popular/very well known

    The 'article script' engines use the command "add keyword to search=2" in their script, so they add keywords to searches from time to time.

    Edit: what could make more sense, though, is an option alongside "always use keywords" that leaves out the quotes ("") and just puts the raw keyword into the search. Then you will get more search results even for specific keywords, but it could happen that they aren't related to your keyword at all. (The three query shapes are sketched below.)
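    For what it's worth, the three query shapes being discussed look roughly like this; these are plain query strings only, nothing SER-specific:

    ```python
    # The three query shapes discussed above; plain strings, no SER internals.
    footprint = '"powered by wordpress"'
    keyword = "german sheppard dog training"

    quoted = f'{footprint} "{keyword}"'  # exact-phrase keyword: few or no results for narrow niches
    raw    = f'{footprint} {keyword}'    # raw keyword: more results, but often unrelated to the niche
    plain  = footprint                   # footprint only: the same ~1,000 results day after day
    print(quoted, raw, plain, sep="\n")
    ```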
  • I think some of you guys are missing the point here: it slows down the threads, but most projects are still submitting; only 1 or 2 are actually scraping. And we are using global lists (at least I am), so even if it ran out of the keyword or URL lists, it would always have the global list to post to, especially since I have unchecked "avoid posting on the same domain twice".
  • ronron SERLists.com

    I have great days (100k) and mediocre days (40k) and yet I have changed nothing. My biggest issue is consistent internet speed. I would start looking at that and not take the word of the service provider.

  • @pisco, I completely agree. There is something seriously wrong with the software; I've been using it for over 2 months and I've spent the last 4 days trying to isolate the problem, to no avail.

    I think we must have something unique in our settings or database, etc., which is causing a problem that no one else sees.

    The only way I can get it to run at normal thread speed is if I import a site list.
  • @ron, I try to monitor the thread usage vs links/LpM. As you correctly say, there are a lot of factors behind link-building speed, e.g. internet speed, keywords, number of proxies, private/public, etc...

    But in all cases, if the thread count is CONSISTENTLY at 0-2 threads for hours, there is something going on that is not related to internet speed, I would think...
  • @ron, what may also be worthwhile to monitor is each individual project, to see which ones are actually submitting. When I was debugging, I could still hit 50 LpM when one project was submitting really well, but when I dug deeper I found that the 7 other projects hadn't been submitting at all for over 4-5 hours (their thread count was really low, I guess).

    So I found I fluctuated between 50-100 LpM: if all projects were submitting, that was great, but a lot of the time it was one project submitting super fast for, say, 3 hours and the others not submitting at all.
  • OzzOzz
    edited February 2013
    What about your proxies? Are you sure they aren't banned from searching? Do your logs show any results, or just "000/000"?

    If you guys are only getting target URLs from your site list after clearing the history, then it could be because your search engines don't give you results, as most of your proxies are banned from them (a rough way to spot-check that outside of SER is sketched below).
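    A quick sketch of how that could be spot-checked outside of SER; this is only a heuristic, the proxy address is a placeholder, and the ban detection just looks for Google's "sorry"/unusual-traffic page:

    ```python
    # Rough spot-check for "is this proxy banned from Google search?".
    # Placeholder proxy address; the ban detection is a simple heuristic.
    import urllib.parse
    import urllib.request

    def proxy_can_search(proxy, query="test"):
        opener = urllib.request.build_opener(
            urllib.request.ProxyHandler({"http": proxy, "https": proxy}))
        opener.addheaders = [("User-Agent", "Mozilla/5.0")]
        url = "https://www.google.com/search?q=" + urllib.parse.quote(query)
        try:
            with opener.open(url, timeout=15) as resp:
                body = resp.read().decode("utf-8", errors="ignore")
        except Exception:
            return False                      # dead proxy, refused, or hard block
        return "/sorry/" not in body and "unusual traffic" not in body

    for p in ["http://127.0.0.1:8080"]:       # placeholder; use your own proxy list
        print(p, "looks OK" if proxy_can_search(p) else "banned or dead")
    ```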
  • I've checked that already using ScrapeBox and the GS proxy checker, and mine are fine...

    When I turn off proxies and run it manually, it also crawls down to only 1 thread.
  • ronron SERLists.com
    Then try using the scheduler because at least you will rotate between projects that way.
  • Going around in circles here; it can't be internet speed or proxies (I test them when it happens, too), otherwise clearing the target URL history would not help either.

    But one thing I would like to understand is the following: when the application has submitted to all of my general lists, having unchecked "avoid posting on the same domain twice", it should "restart" and continue to submit to the list endlessly, right?
  • ronron SERLists.com
    I will give you one last 'trick' I do. Whenever I see things slow down, I click on the Active (P) projects to set them to inactive. And then my speed almost always goes up. Don't ask me why; it just happens.
  • @ron, we are using the scheduler, but I don't think it solves it; there seems to be a bug where one project is "running" but only uses minimal threads, like 1-5.

    I'm pulling my hair out; I've been spending 2-3 hours a day debugging and changing settings around, to no avail.

    @Sven, any suggestions? I've sent you all my logs and done a video dump too.
  • ronron SERLists.com

    @pisco - I just reread your OP. I had to stop using SEO Indexer at the same time I was using SER; it just completely slowed down SER. I first brought this up about a month ago, and it has since been documented by many here on the forum.

    It is best to import your links into SEO Indexer and let it run in its own space and time. I never run them together. You should let SER run by itself and let us know how it goes.

  • ronron SERLists.com

    And I just thought of another thing...

    Did you guys go through the task of assessing your posting for each engine by comparing your numbers for identified vs submitted vs verified?

    The thing is, if you are attempting to post to a bunch of engines (like web 2.0s) that are extremely ineffective for posting (due to reCAPTCHA, changes in site layout, etc.), you will be pissing into a fan.

    You could be going through stretches where SER is hitting a large patch of engines that will never yield any links. And if you have a bunch of those engines in there, that could easily explain why you have good stretches, then bad stretches.

     

  • I understand what you are saying, and in all honesty I haven't done that final LpM tuning yet, mainly because I'm still fighting with this issue, and I would like to have a larger global list to make sure I don't "kill" any engine that can provide value.

    That would explain good and bad days. The problem is that we have a good day after clearing the URL history, and then all the following days get progressively worse. It doesn't bounce up and down; when it goes bad, it stays bad.

    About SEO Indexer: yes, I notice that when both run (and SEO Indexer is doing its thing), it eats up a lot of CPU that would otherwise be useful, and that has an impact. But when performance is sluggish, very few links are built (talking about sub-1 LpM here), so it isn't really working anyway. I did try leaving it off for a few hours to see if I could see a recovery, but again, nothing.
  • LeeGLeeG Eating your first bourne

    A little trick that doesn't involve the nuclear-science-degree approach and time wasted pissing in the wind with data sheets, bio-warfare schematics and zombies.

    Open up a project

    Open up CB

    Choose the engines CB can do the captchas on

    Copy engine selection to all projects

    Simple, innit

  • edited February 2013
    Yes it is (assuming one has CB, which I don't have yet), and thanks for it, but what does it have to do with the problem at hand?
  • LeeGLeeG Eating your first bourne

    Ron mentioned, two posts above mine, setting up your SER.

    I was just pointing out the easy way

     

  • Can you put the video here?
  • OK, so as I see it, we all have the same issue. It starts out fine and then drops. Mine went from 48 LpM to 4 LpM. This only started happening in the last couple of updates, I think the last three. Also, my proxies are fine and not banned.
  • ronron SERLists.com

    @pisco - Seriously, you need to do what I said - and I got it from @LeeG. It is critical. You can tweak all you want and it won't make a difference.

    If you are posting to engines that have reCAPTCHA or have changed their signup page, and you have a string of searches using those footprints, your LpM will start to crash because it simply cannot post to those engines.

    Just go to Options>Advanced>Tools>Show Stats>Identified, Submitted, Verified>Save

    Take all three of those dumps and stick them in Excel, in columns A, B and C. You will have to tweak the vertical spacing to get everything to line up nicely.

    Then you will see stupid stuff like one engine has 100,000 identified sites, 50,000 submissions and 10 verified links. Then it will hit you like a ton of bricks that you are wasting your time with a bunch of engines.

    @LeeG gave you another way to figure it out. They both lead you down the same path. The first way I mentioned will truly help you understand SER better because you see the results behind the curtain. (The same comparison can also be scripted; see the sketch below.)
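    A minimal sketch of that comparison as a script, assuming the three exported stats files are plain "Engine name: count" text lines; the real export format may differ, so the parsing and file names are assumptions:

    ```python
    # Sketch of the identified vs submitted vs verified comparison described above.
    # Assumes each exported stats file has lines like "Engine Name: 12345";
    # the real export format may differ, so adjust the parsing to match.
    import re

    def load_stats(path):
        stats = {}
        with open(path, encoding="utf-8", errors="ignore") as f:
            for line in f:
                m = re.match(r"\s*(.+?)\s*:\s*(\d+)\s*$", line)
                if m:
                    stats[m.group(1)] = int(m.group(2))
        return stats

    identified = load_stats("identified.txt")   # placeholder file names
    submitted  = load_stats("submitted.txt")
    verified   = load_stats("verified.txt")

    # Engines with lots of identified targets but almost no verified links are
    # the ones worth unticking in your projects.
    def verify_rate(engine):
        return verified.get(engine, 0) / identified[engine] if identified[engine] else 0.0

    for engine in sorted(identified, key=verify_rate):
        i, s, v = identified[engine], submitted.get(engine, 0), verified.get(engine, 0)
        print(f"{engine:40s} identified={i:7d} submitted={s:7d} verified={v:6d} rate={verify_rate(engine):.2%}")
    ```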
