Low thread count in latest update v11.02

Hi ALL
After updating SER to v11.02 yesterday, I am struggling with my thread count. I get a max of 100 threads with 27 projects, whereas I was getting the full 1500 threads on the last version.
The latest version is also showing a false "no targets to post to" message, which I know is not correct.
I don't know why this is happening to me, but I have now reinstalled v11.01 and everything is fine.
Any ideas ?
Comments
I could be way off here... but it seems like it's trying to verify way too much.
Anyone affected by this problem got TeamViewer?
I have a ton of GSA installs running... and what a difference when I went back to 11.01. I mean a crazy difference: 400,000 verified overnight, up from around 30,000, across all my machines.
All I did was close GSA, go to the Start menu, and grab the previous version.
This also let me run my normal thread count again. With 11.02 I couldn't get past about 200 threads or my dedis just froze up half the time.
To my knowledge, "continue to post to failed" works by SER remembering which sites that particular project has failed on before; if one of those sites comes round again, it will try to post to it. It does not load the full failed folder. Having this ticked will lower your LPM because of the way it works, but in the long run it gives you more verified links.
I'd always run with the limit CPU and RAM boxes ticked but had started experiencing problems. Much, much smoother now, cheers.
^:)^
I have GSA Proxy Scraper so I was thinking of using public proxies that have a high alive rate when posting to my raw target list after I import it from Scrapebox.
Right now I have 50 shared proxies but can only run GSA at 500 threads.
@710fla I use 50 semi-dedicated proxies to run at 1800 threads. Public proxies will slow you down rather than speed you up, mate.
What is limiting you to 500 threads on 50 shared proxies? I know many people still go by the 10-threads-per-proxy thing. If that's it, don't worry about it.
I actually got annoyed with people saying to run SER at 10 threads per proxy, so I researched it a bit. It seems that was the standard advice given on the forum back in the 2013 era, when SER was being used to SCRAPE Google.
Back then you used 10 threads per proxy, ideally with a scraping delay too, which let you scrape Google for a decent duration. Now people have moved on to scraping with Scrapebox, Hrefer, and GScraper as it is more efficient, and these days 10 threads per proxy against Google is too many and will cause you problems.
When using SER to SUBMIT, it can handle much more than 10 threads per proxy. In my opinion the only limitations are primarily your hardware, then SER itself.
Due to SER being a 32-bit application, it seems to crash when running at over 2,000 threads, at least for me. That's why I run at 1,800: SER seems to spin up extra threads for other tasks too, so I like to leave a 200-thread buffer and just set and forget.
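GSA SER is closed-source, so we can't see how it actually schedules connections, but the idea discussed above, capping how many concurrent submissions share each proxy, can be sketched with one semaphore per proxy. Everything here (proxy names, the cap of 3, the simulated work) is hypothetical and just illustrates the mechanism:

```python
import threading
import time
from collections import defaultdict

# Hypothetical sketch, not SER's real code: one semaphore per proxy
# caps how many submissions can use that proxy at the same time.
PROXIES = [f"proxy{i}" for i in range(5)]  # made-up proxy labels
PER_PROXY_CAP = 3                          # the old advice was 10

limits = {p: threading.Semaphore(PER_PROXY_CAP) for p in PROXIES}
active = defaultdict(int)   # submissions currently using each proxy
peak = defaultdict(int)     # highest concurrency seen per proxy
lock = threading.Lock()

def submit(task_id: int) -> None:
    # Round-robin tasks across proxies, then wait for a free slot.
    proxy = PROXIES[task_id % len(PROXIES)]
    with limits[proxy]:
        with lock:
            active[proxy] += 1
            peak[proxy] = max(peak[proxy], active[proxy])
        time.sleep(0.01)  # stand-in for the actual HTTP submission
        with lock:
            active[proxy] -= 1

threads = [threading.Thread(target=submit, args=(i,)) for i in range(60)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# No proxy ever exceeded its cap, no matter the total thread count.
assert all(peak[p] <= PER_PROXY_CAP for p in PROXIES)
```

The point of the sketch is that the total thread count and the per-proxy concurrency are separate knobs: 1,800 threads over 50 proxies averages 36 per proxy, and whether that's safe depends on the target sites and the proxies, not on a fixed 10-per-proxy rule.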
I bumped my threads up to 1,000 and am seeing how my computer deals with it before bumping it up more. Already seeing a lot more submitted links. You're the man!
I was looking at this post on BHW and had a question: http://www.blackhatworld.com/seo/gsa-on-money-site-good-or-bad.840745/page-3#post-8885487
Why do you untick microblog in Options? Just curious to know before I untick it.
I'm getting a lot more verified links per hour!
If you're getting a lot more verified links and a bigger LPM then fair enough, but if you're not, try turning the threads down to 500-600 and see if your results change.
I would imagine you would need to turn up your HTML timeout quite a bit to run that many threads efficiently, otherwise you could be losing out on backlinks from your lists due to timeouts, not to mention burning out your proxies with such aggressive use. I've spoken to a couple of proxy providers who say they limit what each proxy can do per minute or second anyway. This isn't public knowledge though; it wouldn't be, because they are sold as "unlimited usage". I run 100 dedicated proxies per install right now.
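The timeout point is easy to demonstrate in isolation. This is a hypothetical, self-contained sketch (a local throwaway server standing in for a slow target site, not anything SER-specific): with a timeout shorter than the server's response time the request fails, while a more generous timeout succeeds.

```python
import http.server
import threading
import time
import urllib.request

# Hypothetical slow target site: takes 0.5 s to answer any GET.
class SlowHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(0.5)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # silence per-request logging
        pass

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), SlowHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

# A timeout shorter than the response time loses the submission...
try:
    urllib.request.urlopen(url, timeout=0.1)
    short_ok = True
except Exception:
    short_ok = False

# ...while a generous timeout gets the link.
long_ok = urllib.request.urlopen(url, timeout=5).status == 200

server.shutdown()
assert not short_ok and long_ok
```

When you raise the thread count, target sites respond more slowly under your own load, so the same timeout value that worked at 500 threads can silently drop links at 1,000+.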