
thread/connections problem

edited January 2015 in GSA Email Spider
This is my setup. I am going after website URLs, not search engines.

URL queue limit = 1,000
Proxies = 50
Threads/connections = 50
Only going after emails, not phone numbers or anything else.
Accepting cookies.
Analyzing the head and body.
My proxies are semi-private proxies.

The problem is that the thread count is not even close to 50. Even though I set it to 50, it stays under 20; sometimes it's 7, then 13, but nowhere near 50. I had this problem before, when I had never set a URL queue limit, and I fixed it by setting the queue limit to 1,000. This time that did not fix it. Any thoughts or ideas?

TL;DR: threads set to 50, but the thread count stays under 20.
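As Sven's debugging later in the thread shows, the cause turned out to be a queue full of duplicate URLs. To see why that starves a thread pool, here is a minimal Python sketch (not GSA's actual code; all names here are hypothetical): if the queue deduplicates down to fewer unique URLs than configured threads, only that many threads can ever be busy at once, no matter what the thread setting says.

```python
import queue
import threading
import time

NUM_THREADS = 50  # matches the setting in the post above


def run(urls):
    """Crawl `urls` with a fixed-size thread pool; return peak concurrency."""
    q = queue.Queue()
    # Deduplicate before enqueueing: duplicate entries add no real work.
    for url in dict.fromkeys(urls):  # preserves order, drops duplicates
        q.put(url)

    peak = 0
    active = 0
    lock = threading.Lock()

    def worker():
        nonlocal peak, active
        while True:
            try:
                url = q.get_nowait()
            except queue.Empty:
                return  # no work left; this thread goes idle
            with lock:
                active += 1
                peak = max(peak, active)
            time.sleep(0.05)  # stand-in for fetching the page
            with lock:
                active -= 1
            q.task_done()

    threads = [threading.Thread(target=worker) for _ in range(NUM_THREADS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return peak
```

With 200 copies of the same URL the pool collapses to a single busy thread, while 200 distinct URLs keep most of the 50 threads working, which is the exact symptom described above.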


  • SvenSven
    Save the project and send it to me so I can debug it and see what it is.
  • @sven I sent you a PM with my project file.
  • SvenSven
    I debugged this, and one thing seems strange: the URLs in the queue are all the same!? Did you import a list of URLs?
  • edited January 2015
    Yes, I imported 2 URLs, only 2 URLs. I am parsing everything on the domain, though no external sites. I am also analyzing JavaScript for protected emails.
  • SvenSven
    Hmm, so the duplicate URLs must have been added by the program, you say?
  • edited January 2015
    Yes, I only imported 2 URLs/domains. I am parsing everything, so maybe it is going after every link, even ones that are similar or duplicates. But as far as what was imported, I only imported 2 URLs/domains.
  • SvenSven
    Thanks for everything. I was able to locate the issue and fixed it in v7.17.
  • I have updated my GSA Email Spider to v7.17. Unfortunately I am still having this problem. I still have all the same settings on, but the issue is still there.
  • SvenSven
    You still get duplicate URLs in the queue?
  • No, I was talking about the thread/connection problem, but I think it is fixed now. I let it run for a full day and the threads are staying at 50 or close to 50, so the problem seems fixed.
  • SvenSven
    Yes, I guess you still had the old project with the duplicate URLs loaded. This should be fixed on the next project.
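For readers hitting the same symptom on older versions: the usual way a spider avoids filling its queue with near-duplicate links is URL canonicalization before enqueueing. This is a generic sketch of that idea in Python, not GSA's internal logic; the normalization rules chosen here (lowercase scheme and host, default path, dropped fragment) are illustrative assumptions.

```python
from urllib.parse import urlsplit, urlunsplit

def normalize(url):
    """Canonicalize a URL so trivially different links compare equal."""
    parts = urlsplit(url)
    host = parts.netloc.lower()      # hostnames are case-insensitive
    path = parts.path or "/"         # treat "" and "/" as the same page
    # Drop the fragment: "#section" links point at the same document.
    return urlunsplit((parts.scheme.lower(), host, path, parts.query, ""))

seen = set()

def should_enqueue(url):
    """Only admit a URL to the queue the first time its canonical form appears."""
    key = normalize(url)
    if key in seen:
        return False
    seen.add(key)
    return True
```

With this check in front of the queue, the two imported URLs plus every on-domain link discovered from them would each occupy at most one queue slot, which avoids the duplicate-filled queue Sven found while debugging.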