
GSA getting stuck on 1 thread

edited January 2013 in Need Help
I'm running a list of trackbacks I scraped, and fairly often GSA gets stuck, dropping down to one thread. If I hit stop it doesn't seem to be able to release it. I end up having to force quit, then delete the URLs manually in the cache files and continue.

Anyone else seeing this?

I tried running a lower thread count of 100, but it's still the same problem.


  • OK, it seems the issue is that the sites in question have very long load times, and even with the timeout set to 30 it's not moving on to the next links. A few long-loading URLs in a row seem to cause some kind of lock.
  • Sven
    Set Options -> Filter to only download up to a certain size (1 MB should skip most spammed sites).
  • Thanks Sven, I'll give it a shot.
  • Actually, I just tested the link and it's not spammed, it's just taking 50 seconds to load according to a Webspeed test. So it seems the issue is that multiple URLs from the same slow-loading site in a row are not being filtered well by the timeout setting (I had it set to 30).

    I had 7 of these in a row.
  • If I just post these on their own it eventually gets through them, but in a bigger list with more active threads it was causing the slowdown to only one thread.
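For anyone who wants to spot these slow hosts before feeding a list to GSA, a quick pre-check with a hard per-request timeout shows which URLs would tie up a worker. A minimal sketch in Python (the timeout value is illustrative; GSA's own timeout setting works differently and is not reproduced here):

```python
import urllib.request
from urllib.error import URLError

def is_fast(url, timeout=30):
    """Return True if the URL answers with HTTP 200 within `timeout` seconds."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (URLError, TimeoutError, OSError, ValueError):
        # Connection refused, DNS failure, timeout, or malformed URL:
        # treat all of these as "too slow / unusable".
        return False
```

Running this over the scraped list and dropping every URL where `is_fast` returns False should flag exactly the kind of 50-second sites described above.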
  • edited January 2013

    I’ve got a similar issue. With the latest updates, my GSA SER seems to constantly report a bug and give the option of closing the program, ignoring, etc., when doing a submission run. This happens after some time, sometimes maybe 5 minutes into a run, sometimes shorter or longer, and it happens no matter how many threads I run. I’m currently running 300 threads.

    I chose to ignore the error, and it sometimes pops up with another error immediately after, prompting the same options (ignore/close program, etc.).

    When I do click to ignore, the program continues to run, but when I then click the ‘stop’ button, a project or two will never stop and instead stay active, although they won’t actually submit. They just stay highlighted, the thread count never gets down to 0, and I’m unable to close the program correctly. I’m forced to click the ‘x’ in the top right-hand corner, which states ‘there are still projects running’; I click ‘yes’, and after some time the program then closes.

    In some cases the program closes correctly, but sometimes a box pops up with the message below:


    Search_Engine_Ranker.exe: MM Operation after uninstall.
    FastMM has detected a FreeMem call after FastMM was uninstalled.

    When I do reopen the program, the data hasn’t been saved for the projects which got stuck, so it hasn’t saved the latest submitted or verified numbers, which are much lower than they were previously. I’m trying to hit a certain number of verifications for each project each day, and this is playing havoc with that, as each crash loses the latest submission/verification data for those projects.

  • ron
    edited January 2013

    I’ve had what medway described happen to me twice. On both occasions it was parsing Blogtronix sites, and all of a sudden my threads started going down from 100 to 1, and finally zero, as if it was choking on the websites. I wish I had started a log file, but I had a lot of balls in the air. So I ended up deselecting the Blogtronix platform, and I have not had the problem since.

    I have also had a few GSA error messages (unrelated to the problem I just described), but I just clicked continue and everything was fine. However, one time I lost a few submitted and verified counts.

  • Interesting, as mine seemed to be stuck on 1 even after an hour or so, but maybe it just needed time.

    I solved my issue by running the list through Xrumer LDA to check for 200 OK responses and filter out slow-loading sites.

    Been running for 24 hours now with no problems.
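For anyone without Xrumer, the same pre-filtering (keep only 200 OK responses, drop slow hosts, and apply a size cap like the 1 MB limit Sven suggested) can be done with a short script. A sketch, assuming the list is one URL per line; the timeout, size limit, and worker count are illustrative:

```python
import concurrent.futures
import urllib.request
from urllib.error import URLError

MAX_BYTES = 1_000_000   # skip pages larger than ~1 MB (per Sven's suggestion)
TIMEOUT = 30            # seconds before a slow host is dropped

def check(url):
    """Return the URL if it answers 200 within TIMEOUT and is under MAX_BYTES."""
    try:
        with urllib.request.urlopen(url, timeout=TIMEOUT) as resp:
            if resp.status != 200:
                return None
            body = resp.read(MAX_BYTES + 1)   # read just past the cap
            return url if len(body) <= MAX_BYTES else None
    except (URLError, TimeoutError, OSError, ValueError):
        return None

def filter_list(urls, workers=50):
    """Check URLs concurrently; keep only the fast, small, 200-OK ones."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        return [u for u in pool.map(check, urls) if u is not None]
```

Feeding GSA only the URLs that survive `filter_list` should avoid the runs of slow sites that were locking up the threads.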
  • Brandon (Reputation Management Pro)
    I'm starting to get this problem. I've tried to narrow it down to one project, but it has happened on multiple projects.

    You stop all jobs, and the threads start dropping but never reach 0, forcing you to Ctrl+Alt+Del. When you reopen, the most recently changed data has not been saved.
  • edited January 2013
    I've been running fine on the same list since I last posted (yes, 24/7, the same raw list going through a few million targets as a test).

    I think it locked up once, but I deleted some URLs from the cache and it's been fine now for 6 days straight.

    So try my trick and remove the slow sites first.