
How To Increase Number of Threads

AlexR Cape Town
edited August 2012 in Need Help

I now see there is a thread count in the status bar of the tool. I see that I have anywhere between 10 and 200 threads. Under options I have set threads to 500. I also see that my SER program is only using 0.5 GB of RAM.

My question is: how do I get it to use more resources (i.e. do more)? How do I get it to actually use the thread count I have set in the options (I'd like to see it using 300 to 400 threads)? Is there a setting I am missing that is maybe holding the tool back?



  • Sven
    The program can only use your maximum number of threads if there are enough targets to post to or enough sites to verify. Usually this is not the case, as search engines only deliver a maximum of 100 results, and verification of your submitted URLs does not happen for all sites at once but in intervals according to their last check.
  • Sven
    If you want to use all threads then you should use global site lists only, or import target URLs from a list. In that case there are enough targets present to feed all threads with data.
  • edited August 2012
    Run a project using imported URLs, and it will hit your max thread count.
    I'm fairly sure what you are seeing is related to the search engine scraping aspect. Why, I do not know.
    EDIT: Sven beat me to it!
  • AlexR Cape Town
    "The program can only use your maximum number of threads if there are enough targets to post to or enough sites to verify."

    This implies I need to add some more keywords or lists for it to post to? But I am running 100 projects, each with about 1,000 keywords, so there should be enough to scrape. I have also enabled the option to use scraped keywords to find more targets. So there should be lots for it to go at! I also have 120 SEs selected.

    Some of my projects are using scraped lists (about 1 in 4 projects).

    So from Sven's answer it seems that many of my projects are exhausted (not enough results to go at), which is why I'd have the low thread count. But they are all showing as active, I started with 1,000 keywords each, and it should scrape new keywords from sites.

    Is my understanding correct or am I missing something?
  • 120 SEs can't be good, imo. In my experience many of them will give you just page 1 results, or no/crappy results at all. Some of them are slow, too. Do yourself and us a favour and test at least all the international SEs. Test them with a simple keyword like "powered by vbulletin" in the Search-Online-URL tool one by one. Write down the results of each SE and let us know :)
  • AlexR Cape Town
    @Ozz - I am currently on holiday in Greece. :-) So I can only do very limited testing. I just select all SEs with English language (right click and select).

    @Ozz - so you're saying that with 100 projects and my settings it's somehow running out of things to do, hence the low thread count?
  • Ozz
    edited August 2012
    I've tested them a few days ago, but you should test them on your own machine, because there have been some changes to some SEs and results can differ for each one.

    I don't know if this helps to increase your threads. I just suppose you have some not-so-good SEs in your list.
  • Why is it that SER cannot use more threads to search? 

    Say I have 20 projects running and the thread count set to 200. While all projects are searching, the thread count is somewhere around 20. But why can't SER use more threads for searching, and divvy up the other 180 unused threads among the 20 projects?
  • Sven
    Because if the program used that many threads at once, it would get banned by the search engines.
  • What if I am using a ton of proxies, then could the program run more threads while searching?
  • You can try to lower the search query time first (Options -> Submissions -> Use proxies for... everything -> [1-60]s). The more proxies you use and different SEs selected, the lower you can set the query time. Standard and safe query time by GSA is 60 seconds.
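    As a rough illustration of why more proxies and SEs allow a lower query time, here is a back-of-envelope sketch. The function name and the one-query-per-proxy/SE-pair assumption are mine, not SER's actual scheduler:

    ```python
    # Hypothetical throughput estimate: assume each proxy/SE pair can safely
    # fire one query per `query_delay_s` seconds without tripping bans.
    def queries_per_minute(proxies: int, engines: int, query_delay_s: float) -> float:
        """Upper bound on search queries per minute across all proxy/SE pairs."""
        return proxies * engines * (60.0 / query_delay_s)

    # 10 proxies and 5 SEs at the safe default of 60 s -> 50 queries/minute.
    print(queries_per_minute(10, 5, 60))  # 50.0
    # The same setup at a 5 s query time raises the ceiling twelvefold.
    print(queries_per_minute(10, 5, 5))   # 600.0
    ```

    The estimate only bounds how fast queries can be fired; it says nothing about how many usable targets those queries actually return.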
  • Though it is possible to scrape at a faster rate than the maximum allowed by GSA SER without getting banned by certain engines (as attested by other scrape tools), there is no point in implementing that in the built-in scrape-and-post mode of GSA SER.

    The whole point of using it in this mode is to continuously build links. There is a finite number of link targets to be found at any given time; eventually (and this could be a long time) you will only find new links for a given project when an SE updates its index. Risking IP bans and all sorts of hassle for the sake of decreasing this time makes no sense whatsoever for a continuous-use program.

    If you want to use GSA SER to build lots of backlinks in a short period of time, use the built-in 'import targets' mode and feed it targets from your master site lists, from your second allowed copy running on another machine using either of the 'search online' methods, from purchased lists, or from a 3rd-party fast scraper.
  • I have already set the SE wait time to 5 seconds. I am using 2 licenses on 4 VPS, with private proxies. One of those machines is continuously scraping for new targets.

    What I am suggesting, for more advanced users, is the option of having SER use more threads while searching. I understand using the global site lists, and I do use them when starting a campaign, but when all my campaigns have run through the list, wouldn't it make sense to allow SER to use more threads while searching to find more targets? Even on the machine that is strictly scraping, it still only uses a couple of threads...

    Maybe another option in the 'Options' menu to allow SER to use more threads while searching, but only if the user is using a defined number of proxies?

    I am not trying to beat a dead horse here. This software is already extremely powerful and user-centered (unlike a lot of the other programs out there...). But I use this as an enterprise-class tool at the company I work for, and the CEO hates seeing that the software isn't "always" running at full potential.

  • Ozz
    edited August 2012
    Not a solution, just an idea.

    What if you split your 4 VPS into 2 "scrape only" with a 1 s query time to feed the lists and 2 "post only" with global lists?
    The global lists are stored in one shared folder that both "post only" VPS get their URLs from. You can vary that to 2 "post and scrape" + 2 "post only from site lists", too.
    What you need is a lot of proxies, which I suppose you have, and a good diversity of SEs.

    In the end it is all about how many submissions you make, and I think that this could help to begin with.
  • I have a similar setup. Just be careful you don't have more than one copy WRITING to the same folder. Reading is fine, but writing to the same file across a network will cause conflicts.

    I have one copy set up to just import and sort to site lists, which I feed using SB. If it's not doing anything (e.g. SB is still busy and GSA has finished sorting its current list) I let it search for site lists / search online (using SEs that SB can't use).

    Ultimately the GSA engine is capable of replacing SB, especially with its extended range of SEs, but not in its current form.
    As mentioned before, I see hiving the search-online functions off into a dedicated scraper that can run independently of GSA SER as probably the way forward.
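    The single-writer warning above can be made concrete with a small sketch. This is not how SER manages its site lists internally; it is just a generic lock-file pattern (the name `sitelist.lock` and the function are illustrative) showing how a second writer can back off instead of corrupting the shared file:

    ```python
    import os

    LOCK = "sitelist.lock"  # illustrative lock-file name

    def append_urls(path: str, urls: list[str]) -> bool:
        """Append URLs only if we can take the lock: many readers, one writer."""
        try:
            # O_CREAT | O_EXCL fails atomically if another writer holds the lock.
            fd = os.open(LOCK, os.O_CREAT | os.O_EXCL)
        except FileExistsError:
            return False  # another copy is writing; skip rather than conflict
        try:
            with open(path, "a", encoding="utf-8") as f:
                f.write("\n".join(urls) + "\n")
            return True
        finally:
            os.close(fd)
            os.remove(LOCK)
    ```

    Readers never need the lock; only copies that write into the shared folder do. On a real network share a plain lock file is not bulletproof either, which is why keeping a single designated writer is the safer setup.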
  • AlexR Cape Town
    Another possible solution would be to have a slider where 0 = only scrape and 10 = only post. Then it allocates threads according to the priority you set, i.e. more scraping or more posting.
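    As a sketch, this slider (a hypothetical feature, not an existing SER option) could just be a ratio split of the configured thread pool:

    ```python
    def split_threads(total: int, slider: int) -> tuple[int, int]:
        """0 = all threads scrape, 10 = all threads post; returns (scrape, post)."""
        post = round(total * slider / 10)
        return total - post, post

    print(split_threads(200, 3))   # (140, 60): scrape-heavy
    print(split_threads(200, 10))  # (0, 200): post only
    ```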

  • AlexR Cape Town
    I am checking my setup at the moment and see that it is running between 6 and 10 threads on average.

    I have many projects, many keywords, some private proxies, and many SEs. So there must be targets it can find and different SEs it can run at different times while scraping. If we assume 50 projects, 50 SEs per project, 1,000 keywords, 10 private proxies... well, that's a lot of combinations! Surely it should have more than 10 threads open? It's currently running less than 1 thread per proxy!

    I am not using global lists at the moment as I need to first generate some. So it seems that to get the best of this tool it needs to be used in conjunction with SB and using global lists...

    I don't think it's run out of keywords, so I'm not sure what else it can be.
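    To put numbers on the combinations above (the figures are the ones quoted in this post, and the 2 s per query is the search time discussed earlier), even a single project's search queue is huge if every (SE, keyword) pair counts as one query. One way to read the low thread count is that the bottleneck is the enforced per-query wait, not a shortage of combinations:

    ```python
    # Illustrative arithmetic only; this is not SER's actual scheduling model.
    projects, engines, keywords, proxies = 50, 50, 1000, 10

    queries_per_project = engines * keywords
    print(queries_per_project)             # 50000 queries for one project alone

    # At 2 s per query spread over 10 proxies, one project's queue takes hours.
    hours = queries_per_project * 2 / proxies / 3600
    print(round(hours, 1))                 # 2.8
    ```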
  • Does the program keep track of search engine queries it has done before, even after closing and reopening (after updating)? Just curious.
  • Sven
    No, but as the queue is always different on start (random search engine + keyword + engine footprint) it doesn't matter that much.
  • AlexR Cape Town
    Is there anything else I can do to increase threads? 

    How many threads are most people getting for scraping?
  • Something you could try is to create a separate project for each platform you are submitting to.

    So for example, if you have a project that is submitting to blog comments, forum profiles, image comments, etc., create a group and set up a separate project for each one: one for blog comments, one for forum profiles, one for image comments, etc. (it would be a bit cumbersome if you have a lot of projects already...).

    This is what I am currently doing until I can find a solution for it using only a handful of threads while searching.
  • For most work I get a count far below my limits; for harvesting, most times only a few (with 2 projects). I would be interested to know how multithreading (besides identifying, scraping, etc.) could be used better.
  • In regards to the question about what other users are getting for thread counts: currently I have 15 projects searching for URLs and they are only using 5 threads.... =((
  • AlexRAlexR Cape Town
    Agreed - I'd also like to improve the thread count for the projects. It just seems that the resources/program are not running at full capacity... wondering if it's my settings.

    It seems the only way to get it to full capacity is by importing lists or posting from global lists. Is there another way to use the program at its max?
  • s4nt0s Houston, Texas
    @globalgoogler - Have you adjusted the search time to wait between search engine queries (for scraping)?

    This can definitely speed the software up, but of course you need some good proxies.
  • This helps but even at very low values not many threads are used.
  • AlexR Cape Town
    150 s timeout.
    350 threads (it never gets close to this... normally it's around 10).
    2 s search time. I had it at 4 and at 3, so now trying 2.
    Also - I have set website size to download to 2 MB, down from 4 MB, to try and get things moving.

    Still only using about 10 threads...and have many many projects and many many keywords, so there is a lot for it to go at...
  • AlexR Cape Town
    Just checked now and it's running at 3 threads!
  • @GlobalGoogler - What are your system specs to handle 350 threads?

  • AlexR Cape Town
    @Sinex - I set it that high so it doesn't throttle down. It's never got close to that... it normally runs around 200 when I use lists, but without lists it sits at around 10. Running a 2 GB RAM VPS. Basically, you need site lists to max out the program, it seems.