
Low thread count despite setting it high


Comments

  • edited June 2013
    No, it's not a virus @rodol, it's just protected with Confuser http://confuser.codeplex.com/. That's not a virus at all, but since it packs and obfuscates the assembly, some cheap antivirus software flags it as bad. Get a GOOD virus scanner, not some lousy free one with tons of false positives ;)
    I've got Kaspersky!!
  • edited June 2013
    Hmm, very sad. I was back to 8 LPM when I woke up this morning and saw it. :(

    @ron Any idea why this happened? lol, I remember people saying that after 12 hours or so, they again get shit LPM.

    The keywords file is already loaded and the proxies are beast too.

    edit: Hmm, it appears that T3 was paused on reaching its daily limit. It should still scrape T2 and T1 links fast with all these keywords, though. Still not convinced this is the reason.
  • ronron SERLists.com

    I think you should first review the forum for threads that have to do with high/low LPM. So many have put up screenshots and have been helped. I know the answers are there for you. It's for sure in your settings assuming you only use non-public proxies. 

  • @ron I have both; for scraping I use both private and public proxies (even the public ones all have a speed of at least 1.5). Things are pretty fast when tier 3 is enabled (it currently is, and I'm getting an LPM of 38 now).

    Thanks.
  • edited June 2013
    Hmm @ron, I'm finding (just now tested by setting tier 3 to inactive) that far fewer links are actually scraped for tiers 1 and 2. The higher LPM was just because of tier 3.

    Do you face the same thing?

    Also it uses 0 threads when tier 3 is inactive, damn, it's making me mad.

    Really kinda unsure.
  • @Ron, do you use only the 5 random G search engines? Do you use all 5 at the same time? Do you use any other engines?
  • Thank you KayKay  :)>-
  • ronron SERLists.com

    I use 119 different engines and today was running at 175 LPM. I think most of my gains came from picking efficient engines, and going through that process something like every 3 months or so.

    Yes, of course you have less LPM if you isolate the contextual tiers, or the reverse for the junk tiers. If I isolate the contextual tiers where I have strict limits, I probably average 10 LPM on those. But you should expect that for those projects.

  • @ron Thanks, I thought I was the only one facing this issue on the upper tiers.

    I'm now experimenting with increasing the search engines from 6 to 26 or so, and will see if it benefits me.

    Thanks.
  • Also @ron, curious if you use the option "Always use keywords to find target sites"? I keep it unchecked, and it hardly uses any keywords to find new targets, just footprints most of the time (I think).

    What are your views?

    And it seems Lindexed is working great: I get a 100% crawl rate and I can see some SERP improvements too. :)

    Thanks.
  • ronron SERLists.com
    I have my own keywords, so I never use that.
  • edited June 2013
    @ron First, to avoid any confusion on my side: I'm assuming you're checking the option "Always use keywords to find target sites"?

    I too didn't want to use it, but what about finding sites with footprints, especially sites like directory submits, which have no keyword search and can maybe only be found via footprints? If we check it, it won't search by footprint alone. Should it affect things that much? What are your views on it?

    Thanks again.
  • Could proxies affect the rate of submissions on an imported list? I can't even get above 25 threads using an imported list, and I have my proxies set to Submit and Verify using privates. I've had quite a few problems with my proxies, but I rarely get any Download Failed messages. SER just runs slow, period.
  • @Pratik @Ron - To answer one of Pratik's questions from earlier, I found a program that splits up files automatically. So you could feed in a txt file with 100,000 keywords and have it return 100 lists of 1,000 keywords. Good for everybody, but really good for anyone using Ron's macro method to pull keywords from an external file.
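The splitting described above is simple to do yourself. Here is a minimal sketch (the function and file names are illustrative, not from any specific tool) that breaks one big keyword file into fixed-size chunks:

```python
from pathlib import Path

def split_keywords(src, out_dir, chunk_size=1000):
    # Read the master list, dropping blank lines.
    keywords = [k.strip() for k in Path(src).read_text(encoding="utf-8").splitlines() if k.strip()]
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    # Emit one numbered file per chunk of `chunk_size` keywords,
    # e.g. 100,000 keywords -> 100 files of 1,000 each.
    for i in range(0, len(keywords), chunk_size):
        chunk = keywords[i:i + chunk_size]
        (out / f"keywords_{i // chunk_size + 1:03d}.txt").write_text(
            "\n".join(chunk) + "\n", encoding="utf-8")
```

Each output file can then be pulled into a project with the macro method mentioned above.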

  • Asking again (sorry),

    @ron First, to avoid any confusion on my side: I'm assuming you're checking the option "Always use keywords to find target sites"?

    I too didn't want to use it, but what about finding sites with footprints, especially sites like directory submits, which have no keyword search and can maybe only be found via footprints? If we check it, it won't search by footprint alone. Should it affect things that much? What are your views on it?

    Thanks again.
  • edited June 2013
    Thanks @Ron for the great tips!

    Here's another free tool (on top of Kaykay's) I found to split files instead of doing it manually.



  • ronron SERLists.com
    edited June 2013

    @Pratik, this is where @sven probably should answer the question.

    @Sven, the tick box to "Always use keywords to find target sites":

    1) Can you please explain, if you have this box checked, how and when does SER use this?

    2) And what happens to the keywords you have in SER if that box is not checked - does SER ignore our keywords?

  • I wanna know this also
  • SvenSven www.GSA-Online.de

    @ron

    1) If checked, the program searches for new target sites with one of your keywords in the query. ALWAYS.

    2) If not checked, it will only use the keywords in a query if that engine requires it by setup (blog comments, trackbacks).

  • That means if we want to scrape the most, we should tick that box.
  • ronron SERLists.com

    Thanks @Sven.


  • @Sven For (1), this isn't always the case. In the logs, I think I saw it searching with some other queries despite it being checked.

    @ron So what's the best as per you? Leaving that unchecked I assume?

    Thank you.
  • ronron SERLists.com
    I would probably check it. There are some platforms where it just won't make a difference. But on some others it will. I would give it a 1-2 day run, and see if it changes your LPM. My instincts are telling me to check it.
  • don't put your keywords in quotes if you are ticking that.
  • @Ozz Made sure they're not, thanks!

    Also I tried it both ways, I believe, but cannot spot the difference well; I think it needs extensive monitoring.

    I'll look into comparing again.

    Also @ron do let us know if you do experiment. :)

    @ozz Do you leave it checked or unchecked?

    Thank you.
  • If you are adding foreign keywords with special characters, do they need to be encoded first?
  • @Meatplow, I use Russian keywords without any encoding, and with no problems.
    Import with UTF-8.
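As a minimal sketch of the UTF-8 round trip described above (the file name and keywords are just examples), saving and loading a non-ASCII keyword list with an explicit UTF-8 encoding keeps the characters intact:

```python
import tempfile
from pathlib import Path

# Illustrative Russian keywords; any non-ASCII text behaves the same way.
keywords = ["видеонаблюдение москва", "купить кондиционер"]

# Write and re-read the list with an explicit UTF-8 encoding so the
# Cyrillic characters survive the round trip unchanged.
path = Path(tempfile.mkdtemp()) / "keywords_ru.txt"
path.write_text("\n".join(keywords), encoding="utf-8")
loaded = path.read_text(encoding="utf-8").splitlines()
```

The key point is to be explicit about the encoding on both write and read; relying on a platform default (e.g. a legacy Windows codepage) is what usually mangles foreign keywords.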
  • @crownvic Great, thanks!
  • AlexRAlexR Cape Town
    @crownvic - I recall that you have quite a few sites and wanted to ask you something about registration. Do you register the sites under different names (like with GoDaddy), or does this not matter? Or do you use privacy for registration?
  • @AlexR Privacy is really a must. I don't like anyone snooping through my contact details. Also there's been a huge increase in spammers these days, needless to say. Many registrars like Name.com even have coupon codes for free privacy. ;)