
Low thread count in latest update v11.02

Hi all,

After updating SER to v11.02 yesterday, I am struggling with my thread count. I get a maximum of 100 threads across 27 projects, while the previous version gave me the full 1500 threads.

The latest version is also showing a false "no targets to post to" message, which I know is not correct.

I don't know why this is happening to me, but I have now reinstalled v11.01 and everything is fine.

Any ideas?

Comments

  • steelbone Outside of Boston
    edited July 2016
    same here......

    i could be way off here... but it seems like it's trying to verify way too much.....
  • I tried to switch off verification but the problem was still there. It seems to use too much CPU, and the CPU usage setting automatically decreases the thread count.


    Try the v11.01 version... it solved the thread count issue for me.


  • steelbone Outside of Boston
    I went back to 11.01 and it's cranking away again....
  • I have the same issue with the newest v11.02. @sven.
  • Toby Vietnam
    I have the same issue. v11.02 uses too much CPU.
    It doesn't happen with 11.01.
    How do I go back to 11.01?
  • Sven www.GSA-Online.de
    I haven't made many changes in 11.02, really. I can hardly see any difference here either.
  • shaun https://www.youtube.com/ShaunMarrs
    I saw this the other day and thought nothing of it, but today one of my servers that usually runs at 1800 threads was getting around 200. After seeing this thread and the update I presumed that was the problem, but I still took a look at my settings and found I had accidentally ticked "Automatically decrease threads on a memory usage above xxx".

    Unticked it and now it's back up to 1800 threads with no problem.

    Lesson identified: correlation is not causation :P.

    Not saying it's the same problem for the other guys affected, but it's worth checking.
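    For anyone curious what a setting like "automatically decrease threads on a memory usage above X" effectively does, here is a minimal Python sketch of that kind of throttle loop. This is not SER's actual code; the function name, step size and limits are made up for illustration:

    ```python
    def adjust_threads(current, mem_usage_pct, mem_limit_pct, max_threads, step=50):
        """Shed threads while memory is over the limit; recover toward the cap otherwise."""
        if mem_usage_pct > mem_limit_pct:
            return max(1, current - step)          # back off under memory pressure
        return min(max_threads, current + step)    # slowly restore toward the cap

    # Simulated polls: memory spikes past a 90% limit, then recovers.
    threads = 1800
    for mem in [50, 95, 96, 94, 60, 55, 50]:
        threads = adjust_threads(threads, mem, 90, 1800)
    print(threads)  # → 1800 (back at the cap once memory settles)
    ```

    If a box like this is ticked with a low limit, or the server is busy with other work, the thread count can end up stuck far below the cap, which would match the 200-thread symptom above.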
  • Sven www.GSA-Online.de
    Strange. I just reviewed all the changes and made a profile of it. There aren't many things I could have messed up, really... I couldn't see any difference.
  • notpeople Viet Nam
    I have the same issue with 11.03. GSA uses too much CPU @_@

    How do I go back to 11.01?
  • Sven www.GSA-Online.de
    I'm sure it is not a problem in the software but in the system. I really can't imagine having made performance worse.
  • shaun https://www.youtube.com/ShaunMarrs
    Running 1,800 active threads on multiple copies with the latest version.

    Anyone affected by this problem got TeamViewer?
  • steelbone Outside of Boston
    edited July 2016
    i will test the new version in a bit

    I have a ton of GSAs running... and what a difference when i went back to 11.01... i mean a crazy difference... 400,000 verified overnight, up from like 30,000, across all my machines...

    all i did was close GSA... go to the start menu and grab the previous version....

    This also let me run my normal thread count as well... with 11.02 i couldn't get past like 200, or my dedis just froze up half the time...
  • Hi again,

    I realised that my settings should be reviewed by senior members before I say that things are not working for me.
    I request you guys to have a look at my setup. With these settings I have been getting 1-2 VPM at 1000-1500 threads for the last couple of months from scraped URLs. I am just 3 months old here. I have been working with SB for more than a year, and I was outsourcing SER links before owning a licence.

    I scrape my own lists and filter the scraped links with GSA PI. Then I put those lists in Identified. Then 10 projects are set up according to the SERLists best practices to write to my Verified folder.

    I do the same for the Failed folder, meaning I always have 20 projects running to filter the raw lists and write only a verified list. I have the Identified and Failed sections dedicated to raw lists.

    Project settings for raw list filtering:
    image

    I don't tick "Continuously try to post on same site" for raw list campaigns.

    Here are my global settings.
    50 dedicated proxies from BuyProxies
    image

    Below are the settings for my contextual tiered link projects
    image

    And here are the settings for kitchen sink projects.
    image

    I am using Amazon WorkSpaces as a VPS; here is more information about that

    I am not using any third-party captcha solving. I used CaptchaTronix; they are quite good, but their lower packages are not meant for such a high number of threads. I will move to a higher plan later on.

    So please review my settings and let me know if I missed something.
    Best regards
    Devender Garg
  • shaun https://www.youtube.com/ShaunMarrs
    I don't understand what you mean by raw lists and stuff; I'm guessing your raw list process is taking your identified URLs and verifying them?

    Do you have anything else running on this server, or is it just your SER? If it is just your SER, I would untick the two boxes in the second photo, "Automatically decrease threads on a CPU use above x" and the memory version; that is probably where your low thread count is coming from......

    If this is to verify your identified list, you want "continue to post to failed sites" ticked, as you will get a higher number of verifications overall.

    Not sure what's going on with your tiered link building project. Verify 6 per day +/- 50? Seems pointless to me mate, I would change it to submissions and just do 20 a day depending on your list and site age. Untick Web 2.0 as it's pointless and slows your rig. There are potential contextuals in videos and social networks worth your time, though, that you are missing out on.

    So many people fuck up on the kitchen sink; I would only have blog comments, guestbooks, image comments, and trackbacks. I used to include directories but I dropped them recently. Everything else you have selected with that setup is pointless and just eats your resources and slows down your overall system.

    I have never seen a use in exploit, indexer, microblog, pingback, referrer, RSS, URL shortener, adult video or Web 2.0. To me they just take up system resources for no reason at all.

    Also, hold off before upscaling CaptchaTronix. Do some tests with it. I am doing tests now and I'm not sure if it is worth the effort; CB is matching it so far.
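    Shaun's engine picks above boil down to a simple allow-list. A hypothetical Python sketch (these are descriptive labels, not SER's internal engine identifiers):

    ```python
    # Engines worth keeping in a kitchen sink project, per the advice above.
    KEEP = {"blog comment", "guestbook", "image comment", "trackback"}

    # Engines called out as a waste of resources.
    SKIP = {"exploit", "indexer", "microblog", "pingback", "referrer",
            "rss", "url shortener", "adult video", "web 2.0"}

    def kitchen_sink_engines(engines):
        """Filter an engine list down to the kitchen-sink allow-list."""
        return [e for e in engines if e.lower() in KEEP]

    print(kitchen_sink_engines(["Blog Comment", "Exploit", "Guestbook", "RSS"]))
    # → ['Blog Comment', 'Guestbook']
    ```

    The point of the allow-list framing is that anything not explicitly kept is dropped, so new or obscure platforms never quietly eat resources.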


  • edited July 2016
    Thanks a lot @shaun for solving the low threads issue. I unticked the two boxes, as I am running only SER and CB on that server.

    Yes, verifying my raw list means verifying identified URLs.

    That website is only a month old and there are only a couple of product pages. I just started link building on it and I will increase it soon. I changed it from verified to submissions per day per URL.

    I tried checking "continue to post to failed sites" for the Identified list, but that decreased my VPM a day after I fed in the Identified list. I thought SER was trying failed URLs again and again, and that's why I am not receiving a "no targets to post to" warning. So I unticked that option. What do you say about that?

    I am using the same engines in the Tier 1 and Tier 2 kitchen sinks. I noted them from one of your comments on the forum. But in the Tier 3 and 4 kitchen sinks I was using all non-contextual platforms. So if those platforms are not useful anywhere, I don't need to verify them from my identified lists either, right?

    Thanks for your valuable time.
  • shaun https://www.youtube.com/ShaunMarrs
    In my opinion there is no reason to waste resources verifying or using exploit, indexer, microblog, pingback, referrer, RSS, URL shortener, adult video or SER Web 2.0.

    To my knowledge, "continue to post to failed" works by SER remembering which sites that particular project has failed on before; if one of those sites comes round again, it will try to post to it. It does not load the full Failed folder. Having this ticked will lower LPM because of the way it works, but in the long run it gives you more verified links.
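    As a rough model of the behaviour described above (purely illustrative Python, not SER internals): the project keeps its own memory of failed domains and only retries one when it shows up again in the incoming target stream.

    ```python
    class ProjectMemory:
        """Toy model of per-project 'continue to post to failed sites' behaviour."""

        def __init__(self, retry_failed=True):
            self.retry_failed = retry_failed
            self.failed = set()   # domains this project has failed on before

        def should_attempt(self, domain):
            if domain in self.failed:
                return self.retry_failed   # retry only if the option is ticked
            return True                    # new domains are always attempted

        def record_result(self, domain, success):
            if success:
                self.failed.discard(domain)
            else:
                self.failed.add(domain)

    proj = ProjectMemory(retry_failed=True)
    proj.record_result("example.com", success=False)
    print(proj.should_attempt("example.com"))  # → True (retried when it reappears)
    ```

    Retries cost extra submissions (hence the lower LPM), but some former failures convert into verified links on a later pass, which is the trade-off described above.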
  • @shaun You are a genius.

    I revised each campaign's settings twice and now I am getting max threads and only good-engine verified links. Thanks a lot for the insights. You saved me tons of resources, time and money.
    Best of luck with your projects.

  • codyjason New York

    @shaun: thank you so much. My problem was also solved when I followed your guide.
  • @shaun and a thanks from me also.

    I'd always run with the limit CPU and RAM boxes ticked but had started experiencing problems; it's much, much smoother now, cheers.
  • shaun https://www.youtube.com/ShaunMarrs
    @Devender_Garg @codyjason @Johan

    No problem guys :).
  • redrays Las Vegas
    @shaun - more really good info here, thanks for sharing. And running multiple copies of SER at 1800 threads?
    ^:)^
  • 710fla ★ #1 GSA SER VERIFIED LIST serpgrow.com
    @shaun when running GSA at 1800 threads do you use private dedicated, shared, or public proxies?

    I have GSA Proxy Scraper so I was thinking of using public proxies that have a high alive rate when posting to my raw target list after I import it from Scrapebox.

    Because right now I have 50 shared proxies but I can only run GSA at 500 threads.
  • shaun https://www.youtube.com/ShaunMarrs
    @redrays no worries mate.

    @710fla I use 50 semi-dedicated proxies to run at 1800 threads. Public proxies will slow you down rather than speed you up, mate.

    What is limiting you to 50 shared proxies at 500 threads? I know many people still go by the 10 threads per proxy thing. If that's it, don't worry about it.

    I actually got pissed off with other people saying to run SER at 10 threads per proxy, so I researched it a bit, and it seems that was the standard advice given on the forum back in the 2013 era, when SER was being used to SCRAPE Google.

    Back then you used 10 threads per proxy, ideally with a scraping delay too. This let you scrape Google for a decent duration. Now people have moved on to scraping with Scrapebox, Hrefer and GScraper as it is more efficient, and these days 10 threads per proxy for Google is too much and will cause you problems.

    When using SER to SUBMIT, it can deal with much more than 10 threads per proxy. In my opinion the only limitations are primarily your hardware, then SER itself.

    Due to SER being 32-bit, it seems to crash when running at over 2000 threads, at least for me. That's why I run at 1800: SER seems to activate threads for other things too, so I like to leave a 200-thread buffer so I can set and forget.
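    A quick back-of-envelope on the numbers in this post (50 semi-dedicated proxies at 1800 threads versus the old 10-threads-per-proxy rule):

    ```python
    proxies = 50
    threads = 1800

    per_proxy = threads / proxies
    print(per_proxy)        # → 36.0 threads per proxy when submitting

    old_rule_cap = proxies * 10
    print(old_rule_cap)     # → 500 threads under the 2013-era scraping advice
    ```

    So the setup described here runs each proxy at roughly 3.6x the old rule of thumb, which is fine for submitting even if it would be far too aggressive for scraping Google.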
  • 710fla ★ #1 GSA SER VERIFIED LIST serpgrow.com
    edited July 2016
    @shaun wow, thanks a lot for that insight!

    I bumped my threads up to 1000 and am seeing how my computer deals with it before bumping it up more. Already seeing a lot more submitted links. You're the man!

    I was looking at this post on BHW and had a question: http://www.blackhatworld.com/seo/gsa-on-money-site-good-or-bad.840745/page-3#post-8885487

    Why do you untick microblog in Options? Just curious to know before I untick it.
  • shaun https://www.youtube.com/ShaunMarrs
    @710fla

    Multiple reasons.

    Primarily, I look for one of two things from a SER platform: it either has to provide a contextual link (articles, wikis, social networks) or it has to be something that should already be indexed (blog comments, image comments, guestbooks). To my knowledge, and from my testing, microblog is neither.

    In my opinion it is hard enough to index SER contextuals these days, never mind gash little links like microblogs. So I don't waste server resources scraping, identifying or verifying them.

    Secondly, when I did run a scrape for them there were very few hits compared to what I usually go for, and the verification rate was crazy low.
  • redrays Las Vegas
    The 10 threads per proxy 'rule of thumb' is complete bullshit. It might have some relevance if you're trying to solve reCAPTCHA, but even then I'm skeptical. I spent a while testing different numbers of proxies and performance got no better with more. Now I just push the threads up until my memory usage gets too high.
  • 710fla ★ #1 GSA SER VERIFIED LIST serpgrow.com
    Thanks guys! SER is running smoothly at 1200 threads while I'm in class. I had it at 1400 threads but my desktop crashed.

    I'm getting a lot more verified links per hour!
  • Tim89 www.expressindexer.solutions
    edited July 2016
    Is running that high a number of threads worth it nowadays, though? There was a time I used to run at 1200 threads, but now I run at around 500-600 threads. I build around 150,000-200,000 contextual verified links a day per install with no problems, at an LPM of around 300 showing from one install right now.

    If you're getting a lot more verifieds and a bigger LPM then fair enough, but if you're not, try turning the threads down to 500-600 and see if your results change.

    I would imagine you would need to turn up your HTML timeout quite a bit to run that number of threads efficiently, otherwise you could be losing out on backlinks from your lists due to timeouts, not to mention burning out your proxies quite a bit with such aggressive use. I've spoken to a couple of proxy providers who say they limit the amount each proxy can do per minute or second anyway... This isn't public knowledge, though; it wouldn't be, because they are sold as "unlimited usage". I run 100 dedicated proxies per install right now.
  • shaun https://www.youtube.com/ShaunMarrs
    @Tim89 My non-contextual T3 stuff runs on SER at 1800 threads and works a treat. It's the only way I can knock out the volume required that far down the chain.

    Any chance you could check how many domains those 200,000 contextuals are spread across? Really curious. I have just over 1,000 dofollow contextuals now and about 3,000 nofollow contextuals in my verified folder, but I never use the nofollow ones.
  • I am also having the same problem of low LPM. I have even sent you a bug report, @Sven.

    My LPM is ridiculously low at 1.70.

    How can I switch back to v11.01?

    Anyone have the v11.01 exe file?