No, it's not a virus, @rodol. It's just protected with Confuser (http://confuser.codeplex.com/), which is no virus at all. But since it packs and obfuscates the assembly and such, some cheap antivirus software thinks it's bad. Get a GOOD virus scanner, not some lousy free one with tons of false positives ;) I've got Kaspersky!
Hmm, very sad. I was back to 8 LPM when I woke up this morning and saw it.
@ron Any idea why this happened? lol, I do remember people saying that after 12 hours or so they get shit LPM again.
The keywords file is already loaded and the proxies are beast too.
Edit: Hmm, it appears that T3 was paused after reaching its daily limit. That said, it should still scrape T2 and T1 links fast with all these keywords. Still not convinced that's the reason, though.
I think you should first review the forum for threads that deal with high/low LPM. So many people have posted screenshots and been helped. I know the answers are there for you. It's for sure in your settings, assuming you only use non-public proxies.
@ron I have both. For scraping I use both private and public proxies (even the public ones all have a speed of at least 1.5). Things are pretty fast when Tier 3 is enabled (which it currently is, and I'm at 38 LPM now).
Hmm @ron, I'm finding (just tested by setting Tier 3 to inactive) that very few links are actually scraped for Tiers 1 and 2. The higher LPM was just because of Tier 3.
Do you face the same thing?
Also, it uses 0 threads when Tier 3 is inactive, which is driving me mad.
I use 119 different engines and was running at 175 LPM today. I think most of my gains came from picking efficient engines and going through that process roughly every 3 months.
Yes, of course you have less LPM if you isolate the contextual tiers, or the reverse for the junk tiers. If I isolate the contextual tiers where I have strict limits, I probably average 10 LPM on those. But you should expect that for those projects.
Also @ron, I'm curious whether you use the option "Always use keywords to find target sites"? I keep it unchecked, and it hardly uses any keywords to find new targets; it just uses footprints most of the time (I think).
What are your views?
And it seems Lindexed is working great; I get a 100% crawl rate and I can see some SERP improvements too.
@ron Firstly, to clear up my confusion: I'm assuming you have the option "Always use keywords to find target sites" checked?
I didn't want to use it either, but what about finding sites via footprints, especially platforms like directory submissions, etc., which have no keyword searching and can maybe only be found by footprints? If we check it, it won't search by footprints alone. Does that affect things much? What are your views on it?
Could proxies affect the rate of submissions on an imported list? I can't even get above 25 threads using an imported list, and I have my proxies set to Submit and Verify using privates. I've had quite a few problems with my proxies, but I rarely get any Download Failed messages; SER just runs slow, period.
@Pratik @Ron - To answer one of Pratik's questions from earlier, I found a program that splits up files automatically. So you could feed it a list of 100,000 keywords in a txt file and have it return 100 lists of 1,000 keywords each. Good for everybody, but really good for anyone using Ron's macro method to pull keywords from an external file.
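If you don't want to hunt for a separate program, a few lines of Python can do the same split. This is just a sketch with assumed names (a keywords.txt file with one keyword per line, an output folder called chunks); adjust it to whatever files you actually use:

from pathlib import Path

# Split a large keyword file (one keyword per line) into fixed-size chunks.
def split_keywords(src="keywords.txt", out_dir="chunks", per_file=1000):
    keywords = Path(src).read_text(encoding="utf-8").splitlines()
    Path(out_dir).mkdir(exist_ok=True)
    for i in range(0, len(keywords), per_file):
        chunk = keywords[i:i + per_file]
        out = Path(out_dir) / f"keywords_{i // per_file + 1:03d}.txt"
        out.write_text("\n".join(chunk), encoding="utf-8")

if __name__ == "__main__":
    split_keywords()  # e.g. 100,000 keywords -> 100 files of 1,000 each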
@ron Firstly, to clear up my confusion: I'm assuming you have the option "Always use keywords to find target sites" checked?
I didn't want to use it either, but what about finding sites via footprints, especially platforms like directory submissions, etc., which have no keyword searching and can maybe only be found by footprints? If we check it, it won't search by footprints alone. Does that affect things much? What are your views on it?
I would probably check it. There are some platforms where it just won't make a difference. But on some others it will. I would give it a 1-2 day run, and see if it changes your LPM. My instincts are telling me to check it.
@crownvic - I recall that you have quite a few sites, and I wanted to ask you something about registration. Do you register the sites under different names (like with GoDaddy), or does this not matter? Or do you use a privacy service for registration?
@AlexR Privacy is really a must. I don't like anyone snooping through my contact details. There has also been a huge increase in spammers these days, needless to say. Many registrars like Name.com even have coupon codes for free privacy.
@Pratik, this is where @sven probably should answer the question.
@Sven, the tick box to "Always use keywords to find target sites":
1) Can you please explain, if you have this box checked, how and when does SER use this?
2) And what happens to the keywords you have in SER if that box is not checked - does SER ignore our keywords?
@ron
1) If checked, the program searches for new target sites with one of your keywords in the query. ALWAYS
2) If not checked, it will only use the keywords in a query if the engine requires it by its setup (Blog Comments, Trackbacks).
Thanks @Sven.
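To make Sven's two cases concrete, here's a rough sketch of the query-selection logic as I understand his answer. It's not SER's actual code; the footprint/keyword values and the requires_keyword flag are just assumed examples:

# Rough illustration of the behaviour described above (not SER's actual code).
def build_search_query(footprint, keyword, always_use_keywords, requires_keyword):
    # Box checked: every scrape query carries one of your keywords.
    if always_use_keywords:
        return f'{footprint} "{keyword}"'
    # Box unchecked: keywords are only added when the engine's setup
    # requires them (e.g. blog comments, trackbacks).
    if requires_keyword:
        return f'{footprint} "{keyword}"'
    # Otherwise the query is footprint-only.
    return footprint

# Example call (values are made up):
# build_search_query('"powered by wordpress" "leave a comment"', "dog training",
#                    always_use_keywords=False, requires_keyword=True)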