
Low thread count despite setting it high

Hello,

At first the thread count stayed close to what I set, but not anymore. For example, if I set 300, the threads stay around 11, and only in rare cases do they go above 250 or so.

What might be the reason for this?

Thanks.

Comments

  • edited June 2013
    I'm having this problem also. The threads run at my maximum when verifying, but when going through the search engines I'm in the 4-11 thread range. I don't understand it at all. My proxies aren't failing, and I keep seeing the [end of results] message in the log, which I believe indicates SER is scanning the search engine pages. It's killing me right now; my LPM is under 0.02 on my Tier 1 with 125,000 new keywords, 250 threads, and 7 primary engines selected.
  • AlexR Cape Town
    Check this thread out. Having a similar issue:
  • spunko2010 Isle of Man
    It happens to me every so often. I just reboot and it goes away.
  • edited June 2013
    @spunko2010

    I too noticed that after I stop and restart it, it utilizes full resources. But that lasts no longer than 5-10 minutes or so.

    So how convenient is that?

    Can @Sven take a look and see if something could be added/improved to fix this?

    edit: Tested again by stopping and restarting, and now I'm getting an LPM of around 38-40, nice. It always happens this way, and after a few minutes it goes back to the slow stage again.

    Can't something be done about this? Or is there any way to stop the projects and restart them every 15 minutes or so?

    @AlexR yes, I did see that.

    Thank you.
  • spunko2010 Isle of Man
    Pratik, are you using a server or your PC? Try running a virus scan if you are using a server. I don't know if it fixed things for me permanently, but since I removed a trojan it has been working great...
  • @spunko2010 It runs on my VPS. The VPS only has 2-3 programs, including GSA SER and CB, and nothing else; practically empty, you could say.

    So I'm unsure how it could be a virus issue.

    But maybe I'll try running Malwarebytes on it and see what I find.

    So after catching and cleaning the virus, does it run at a constant speed for you? What LPM are you getting?

    Also, a point worth noting: I'm scraping using keywords and not using the global site list. What's your config? Scraping, or using the global site list?

    Thanks.
  • edited June 2013
    Unsure how this works, but based on a few discussions I read, I deleted the target URL cache and it seems to run fine currently; not sure how long it will run like this though.

    People pointed out that it generally doesn't last longer than 12-24 hours; however, even that is quite a bonus for me.

    Getting an LPM of 38 or so.

    What does target URL cache deletion actually do, @Sven or anyone? I assume it forgets where it was, or is it something else?

    Thank you.

    edit: I'm now at 27.50 LPM. It was falling and not making any links, so I made some changes. I'm just running 1 project for now, so on Tier 3 I set the options to never do verification (never needed anyway; maybe I can do it all once a day or at an interval of 3 days or so), since I don't have a Tier 4 going, so there's actually no need to verify Tier 3.

    For Tier 1 and Tier 2, I set verification to every 12 and 6 hours respectively (instead of 60 minutes for both, as before).

    Let's see how long this actually lasts.

    edit 2: lol, things seem to be going down again, sad.

    Hope there's something that could work.

    Maybe experts like @Ron could help.

    Thanks again.
  • ron SERLists.com
    edited June 2013

    To help make it go away, I would disable verification on all junk tier projects like T1A, T2A, etc. You can always run verification on those tiers over the weekend.

    Have verification on contextual tiers like T1, T2 set to automatic every 1440 minutes.

    The other thing I would do is come up with a very large list of keywords, and break them up into smaller files, and use a spin folder macro in the keyword field. I have tested having this folder on my desktop vs. Dropbox, and it is noticeably faster if the folder is in Dropbox.

    Forget the global site list, and do not scrape keywords using SER. Scour the internet and find a big-ass list of the most common words/phrases/search terms etc. I mean 100,000 for starters, but you really want 1,000,000 or more ultimately. Put them in files of about 1000. And then have 1000 files in that folder, and you now have 1,000,000. You need to feed the beast to keep it going fast. The more 1-word and 2-word terms you have, the more new targets you will find.

    Then look at your advanced tools, and see what the engine ratios are for verified vs. submitted. Kill the inefficient engines.
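
    If you want a quick way to eyeball those ratios outside of SER, here is a rough sketch in Python; it assumes you have copied the per-engine submitted/verified counts into a simple CSV (the file name and column names are made up for illustration, they are not an SER export format):

    import csv

    # Rank engines by verified/submitted ratio, worst performers first.
    # Assumes engine_stats.csv with columns: engine,submitted,verified (hypothetical).
    def engine_ratios(path="engine_stats.csv", min_submitted=50):
        rows = []
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                submitted = int(row["submitted"])
                verified = int(row["verified"])
                if submitted >= min_submitted:  # skip engines with too little data
                    rows.append((row["engine"], verified / submitted, verified, submitted))
        return sorted(rows, key=lambda r: r[1])

    if __name__ == "__main__":
        for engine, ratio, verified, submitted in engine_ratios():
            print(f"{engine:30s} {ratio:6.1%}  ({verified}/{submitted})")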

    Always set up dummy projects where you run the engines you think are bad so that you always have data on all engines. Sometimes Sven changes an engine, and suddenly it becomes a performer. Maybe because he tweaked the script. So never completely discard an engine because today it is failing you. Always compile data on the "bad" engines because some do change for the better.

     

  • edited June 2013
    ron, good suggestion.
    But how do I use a macro for keywords?
    And are you also using a macro for articles, and how frequently do you change your articles?
    And what is your custom wait time between search engines? Because I am still facing an IP block problem.

    Thanks again for helping us.
  • @Ron Thanks for the great tips. However, which projects did you mean by "junk"? I'm not very sure what T1A and T2A projects are.

    I have verification running for T1 and T2 every 6 and 12 hours respectively. Maybe I should set both to 24 hours?

    Also, I have around 4K keywords now, which I now think is very low.

    The main problem is search engines banning the IP. I have good high-speed private proxies (about 25) and public proxies too, but it still happens. I've selected both in the search engine scraping option.

    Also, can't I have 100,000 keywords separated by commas? I didn't quite get the folder idea and am very unsure about it.

    Thanks a lot for the help!
  • ron SERLists.com

    Go here to understand contextual (T1,T2,T3) vs. junk (T1A,T2A,T3A): https://forum.gsa-online.de/discussion/2930/ser-tiers/p1 - Hopefully you are not mixing junk links with good contextual links.

    I would put the T1 and T2 at automatic 1440 minutes. You only need to verify those once per day. Otherwise you slow down the posting.

    If you stick a million keywords directly into SER, you will definitely slow it down. You don't want to throw extra weight on the shoulders of the program. By using a macro (look at the help button in SER), you can direct the keyword field to look at a folder stuffed full of keyword files. It makes things go much faster.

    I never use public proxies for anything. Terrible for performance.

    Try to make sure you don't have filters for PR or OBL as that will slow things down terribly.

    And for articles I use KM spins with the anchor text tokens embedded in the spins. I would change them every month or two, probably two.

  • edited June 2013
    Edit: nevermind
  • @pratik - for the folder idea, you would scrape keywords using a tool like Scrapebox, filter them down to only 1-2 word phrases using a tool like Market Samurai (not sure what anyone else uses), then create lists of 1,000 words each. Then you can import those into GSA and let it find URL targets using those keyword lists. Time-consuming but very effective.
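
    If you don't have Market Samurai, a few lines of script could handle the same filtering. This is only a rough sketch and assumes the scraped keywords sit one per line in a plain text file (the file names here are just placeholders):

    # Keep only 1-2 word phrases from a scraped keyword dump, de-duplicated.
    def filter_short_phrases(src="keywords_raw.txt", dst="keywords_short.txt", max_words=2):
        seen = set()
        with open(src, encoding="utf-8", errors="ignore") as fin, \
             open(dst, "w", encoding="utf-8") as fout:
            for line in fin:
                kw = " ".join(line.split()).lower()  # normalize whitespace and case
                if kw and len(kw.split()) <= max_words and kw not in seen:
                    seen.add(kw)
                    fout.write(kw + "\n")

    if __name__ == "__main__":
        filter_short_phrases()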
  • AlexR Cape Town
    @ron - have you done any testing of niche keywords versus general keywords and the impact? 
  • edited June 2013
    @Ron Thanks again. I saw your thread just now; interesting read. Well, I do mostly what most people do, but after seeing your guide I've now enabled the Article and Directory engines in Tier 2 as well (not doing the kitchen-sink thing right now, but I might set it up very soon, kudos for the tip!), and might do so for Tier 3 too in the near future.

    About the macro thing, it's still not very clear to me from the documentation either (sorry, I just skim-read the content), but it'd be great if you could give a nice example or point to an existing thread (the forum search basically doesn't pull up the results I want, sad).

    Should I use the option right-click on the keywords field -> Choose random (from a folder)? I assume that'd fetch keywords from all the .txt files in the selected folder?

    Or is it this? https://forum.gsa-online.de/discussion/comment/19430/#Comment_19430 - But I think this is just for a single file, so what would be the correct way? An example would be really great.

    Also, for Tier 3 I actually had verification set at a specific interval, but now that I've disabled it and have the Links Per Day option (for total verifications) set for it, I think there'd be no way for it to count and pause at that point since I've disabled verification? Or should I not care about limiting my links per day for T3 anyway?

    I'm also quite worried about setting verification to 1440 minutes for T1 and T2, since I've set a link limit per day for them too, but I think it should be fine. What are your views on this?

    Your and everybody's replies are much appreciated.

    Also, is anybody willing to spare a standard keywords list that could be posted here? :P

    I have Scrapebox and I do scrape related keywords, though.

    Thank you.


  • ron SERLists.com
    edited June 2013

    You stick this in the keyword field:

    %spinfolder-C:\Users\Administrator\Dropbox\keyword-folder%

    Note: The path above is the path to where my keyword files are. Obviously, your path may not be Administrator, you may not have Dropbox and may instead use a folder on your desktop, etc. In that folder, you might create 1,000 different files (in .txt vertical format, meaning no commas or pipes, just a vertical list) with 1,000 different keywords in each file. SER will randomly select different files in that folder, and you will never run out of targets.
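
    If you're handy with a bit of scripting, something like this rough sketch could do the chopping into 1,000-keyword files for you; the source file name and output folder are placeholders, not anything SER requires, so point the output at whatever folder your %spinfolder% macro reads:

    import os

    # Split one big keyword list (one keyword per line) into files of 1,000 each,
    # written as plain vertical lists (no commas or pipes), named 1.txt, 2.txt, ...
    def split_keywords(src="all_keywords.txt", out_dir="keyword-folder", per_file=1000):
        os.makedirs(out_dir, exist_ok=True)
        with open(src, encoding="utf-8", errors="ignore") as f:
            keywords = [line.strip() for line in f if line.strip()]
        for i in range(0, len(keywords), per_file):
            chunk = keywords[i:i + per_file]
            name = os.path.join(out_dir, f"{i // per_file + 1}.txt")
            with open(name, "w", encoding="utf-8") as out:
                out.write("\n".join(chunk) + "\n")

    if __name__ == "__main__":
        split_keywords()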

    @sven just mentioned in another thread that a good way to get SER to not overshoot your daily linkbuilding target is to have more frequent verification. So I won't tell you not to follow Sven's advice on that. My opinion is to only do that on the T1 project. Who cares if you overshoot on a T2 or T3, and again, who cares if you overshoot on the junk tiers T1A, T2A, T3A. It just doesn't matter because NONE of those link directly to your moneysite. I have verification disabled on junk tiers, and set to automatic 1440 minutes on the contextual tiers T1, T2, T3 - but that's what works for me.

    Scrapebox gives you too many longtail search terms. Start searching the internet for one-word and two-word lists. Hell, if you think about it, a dictionary is one big-ass one-word list. That should give you some ideas.

  • Hmmm, nice, will try it too @ron. Is there the ability to add two spinfolders, @Sven?

    I would like to use one folder with niche-specific keywords and another one with the best search keywords at the same time. Is this possible?
  • @kaykay you could just mix the niche lists in with the general lists.  If you're trying to determine which delivers more links you may have to test by changing the folder path every 12 hours or something. 
  • Here is a list of 109,582 single English words: http://www.mediafire.com/?h97c8ciiwb6675z

    Happy Scraping,
  • edited June 2013
    ron

    %spinfolder-C:\Users\omi\Desktop\Dropbox\GSA\Keyword list% (Keyword list is my folder name; it has 81 files named 1, 2, 3, ... 81, containing 1k keywords each)

    But when I click the test button to make sure it's pulling keywords from the folder, I see that it pulls thousands of keywords each time I click test again.
    So what am I doing wrong? (sorry bro, I don't know much about macros)
  • @baba Make sure each line contains one keyword only.
  • audioguy, my keywords look like this... and I think this is right.

    medicare gov
    microsoft windows
    portland or
    poway california
    san antonio tx
    transmissions
  • @baba Same here. Just tested this. That is how the spinfolder macro works, I guess. I usually use that macro in the article content field. Each file contains the whole content for one article.

    Perhaps @ron can explain. Will SER take the list randomly and go through it one by one, or pick one random file every time it needs to hit the search engine?
  • @baba How did you split into 1K keywords per file? Manually? Also, are you using those dictionary keywords provided by rodol?

    Also @rodol thanks.
  • ron SERLists.com

    I can't answer the mechanics of how it grabs the keywords within the file. I just know that it grabs a random file from the folder each time.

    My assumption (if you are using scheduler) is that it will grab a different file each time the project is turned on by the scheduler. But once it grabs the file, I would think it would start from the top - but I could be wrong on that. 

  • @ron I'm sorry for asking too many questions haha (I really am), but is splitting 100K keywords into 1K each going to be manual work, or is there some other tool which could do the job?

    Thank you.
  • ron SERLists.com
    @Pratik, it is manual. But 1-2 hours of mindless work, and you are set, and never have to screw around with keywords. So it is worth it!
  • @ron Definitely, thanks, will get to work now.

    Also, one thing I never got to understand is how deleting the "target URL cache" makes GSA speed up like a beast for a while. Hmm.

    Any ideas?

    I'm still back to the old speed despite the changes, but hopefully this will change when I add more keywords in macro format.

    Thanks!
  • ron SERLists.com
    edited June 2013

    I never do that. I think it sounds great in theory, and I have tried it, but I have never witnessed higher speed after doing it. I think some people did that to decrease RAM usage if they were having freezes, but not to gain speed.

    Speed doesn't come all at once. It comes in steps like a staircase. You experiment, you figure something out, it goes up. Then you stay there until you experiment with other stuff.

    Things that help are removing inefficient footprints from engine files, and constantly assessing verified-to-submit ratios on engines, and then making cuts or additions based on performance.

    Then you also test other stuff like your settings, which boxes you check, which ones you don't, etc. You literally need to think about what SER is doing with each little box you check - i.e., does checking that little box mean that SER will have to work harder to find links, etc., etc.

    When you get to the point that you have tested everything, then I guarantee that you will have terrific speed.

  • edited June 2013
    @ron nice post. For me, however, that did increase posting speed (hence LPM) for a while. Also, I think since you already have a good LPM going, you might not have seen any benefit. But for small players like us, it does something for a while.

    Also I keep on configuring various options here and there and try to see if anything changes.

    Curious as to how many search engines you are using currently? I use 5-6.

    Also, what are your views on building links continuously? Do you ever stop any projects once you've achieved the desired ranking, or keep building them (even if only a few links per day)? I think stopping a project entirely maybe increases the chances of getting penalized or losing rankings?

    Thanks.