@baba - It isn't so much about adding new footprints as it is about removing the bad ones. I did it across all engines, and I'd roughly estimate I got rid of a third of the footprints.
As per @Ozz's statement, I've already made $117.38 while sleeping! Lol! This new CB makes me laugh every time. This will surely boost CB sales. Great work, Sven.
Too bad I forgot the freaking setting that grabs public proxies every 4 hours to feed GScraper. SER crashed, so I had to kill its process... and lost my submitted and verified links. Damn!
Well, until yesterday I was eager to update whenever a new version was released. But now I'm scared, and tired as hell of re-updating all my engine files.
Could you let me know whether my CB and SER captcha configurations are correct? I've applied the same config as @gsarver, but my LPM is 0.5. I've also edited the engine files, so I guess it's down to the captcha parameters.
- 'only solve if success rate is 55' <-- where is that coming from? lower that to 20 or 10 (default), imo. depending on how many retries you are doing, you might be surprised how much the solving rate increases when you do the math.
1 retry = 2 tries to solve a 20% captcha = 36% overall solving rate.
if you are doing 0 retries, then 55% might be ok, but it feels a bit too high in my opinion.
at least for your T1s i suggest using 1-2 retries. for the kitchen sink, 0 retries is ok.
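to make the math explicit, here is a minimal python sketch of that calculation (the function name and example numbers are just for illustration, not anything from SER or CB):

```python
# overall solve rate: each attempt is an independent try at the
# per-attempt rate, so overall = 1 - (1 - p) ** attempts
def overall_solve_rate(per_attempt_rate, retries):
    attempts = retries + 1  # the first try plus the retries
    return 1 - (1 - per_attempt_rate) ** attempts

print(overall_solve_rate(0.20, 1))  # 1 retry on a 20% captcha -> 0.36
print(overall_solve_rate(0.20, 2))  # 2 retries -> 0.488
```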
however, i doubt the CB settings are causing such a low LPM. feels more like your proxies are the problem, and/or your engine selection plus filters.
please read every single post in this thread and also those in the sticky 'Compiled list of tips..'-thread. that's the best advice anyone could get, as it's all in there.
edit: just saw that you 'skip' a form field if it can't be filled. change that to 'random'.
@SEOMystic: You should change the Skip option to Choose Random, as Ozz said. As for CB, I personally don't filter by % solve rate; my option is unchecked.
About editing your engine files: pull out the footprints that perform poorly. Search for "use blog search=1" and change it to 0.
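If you want to flip that setting across every engine at once, a few lines of script can do it. A rough Python sketch, assuming your engine files are the .ini files in SER's Engines folder (adjust the path to your own install):

```python
import pathlib

# assumed install path -- point this at your own SER Engines folder
ENGINES_DIR = pathlib.Path(r"C:\GSA Search Engine Ranker\Engines")

for ini in ENGINES_DIR.glob("*.ini"):
    text = ini.read_text(encoding="utf-8", errors="ignore")
    if "use blog search=1" in text:
        # keep a backup of the original before patching it
        ini.with_name(ini.name + ".bak").write_text(text, encoding="utf-8")
        ini.write_text(text.replace("use blog search=1", "use blog search=0"),
                       encoding="utf-8")
        print("patched", ini.name)
```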
Only check the engines that give you the best results; cut out the ones that only give you "hundreds" of verified... It was hard for me to decide to uncheck any engines, because, well, diversifying our platforms/engines is good, but I already have enough that give me good results.
You didn't mention your email settings, but Hotmail gives good results. I'm using hotmail/live/outlook, as stated in my post on the previous page.
That might take a couple of hours but it sure will increase your LPM.
As for me, my LPM has dropped again to ~90-100. I thought it could maintain this morning's ~190 for the whole day... *sigh*... But anyway, it's already 10 times faster than yesterday xD
I use 25 private proxies. I've checked them with SER and they seem to be good.
OK, I'll apply the captcha changes. I don't check exactly the same platforms as @Alex, I think, so I'll have to uncheck some platforms again. (It hurts!! :()
Guys, thank you a lot for your help. I'll post my results shortly.
@audioguy: Well, your LPM is wayyyy higher than mine, considering the thread count I'm running is 8 times yours. I'm running 1000 threads with 600 private proxies. It's now stable at 90-100 LPM. I swear I won't touch the GSA products today, so I can see the results over a whole day. Lol. Just a sneak peek:
lower your threads to 250 for a while to see if it really makes a difference compared to your 1000. i think someone posted here a few days ago that "lowerish" thread counts didn't have such an impact.
I've edited the engine footprints, not just the 'use blog search' setting... now I'm really uncomfortable with updates -___- I'll only update when there's something more important. In the next 9 hours I'll post my screenshot of a 24-hour SER run. Currently: ~93k submitted and 27k verified. Oh, by the way, I've saved $234 with CB, lol!
Alex, just rename your files. You are making this way too complicated. Then, obviously, have those modified ones checked in your projects. Then you never have to worry about regular version updates overwriting your engine files.
Next, you want to keep track of when @sven actually changes something inside an engine file. So keep a little text document or spreadsheet with the names of the engine files you have modified and the 'date' of the original files. Then all you have to do is occasionally look for date changes to the originals you modified. The reason you'd do this: @sven does improve engine performance when there's a problem, so you want to make sure you pick up those improved engine files. Then you can turn the blog search off in that new file and save it as your new modified file. The files with that particular code are very small.
The actual changes to engine files are not too frequent, so this is a thing you can manage without extraordinary effort.
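If you'd rather not keep that spreadsheet by hand, a small script can snapshot the file dates and flag what changed after an update. A rough Python sketch; the Engines path and the snapshot filename are my own assumptions, not anything SER provides:

```python
import json
import pathlib

ENGINES_DIR = pathlib.Path(r"C:\GSA Search Engine Ranker\Engines")  # assumed path
SNAPSHOT = pathlib.Path("engine_dates.json")  # arbitrary local record

# collect the current modification times of all engine files
current = {p.name: p.stat().st_mtime for p in ENGINES_DIR.glob("*.ini")}

if SNAPSHOT.exists():
    previous = json.loads(SNAPSHOT.read_text())
    changed = [name for name, mtime in current.items()
               if previous.get(name) != mtime]
    print("changed since last snapshot:", changed or "nothing")

SNAPSHOT.write_text(json.dumps(current))  # save for the next comparison
```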
@Alex, what type of footprint queries do you remove from the engines? I have almost the same option settings as you (using the CB settings too) but only get around 5 LPM, so pathetic.
I use 300 threads, 40 semi-dedicated US proxies, and a 120s HTML timeout; GSA runs on a dedicated server with 24 GB RAM and a 1 Gbps port.
Literally scratching my head over how to improve LPM. 8-|
@audioguy: No, basically I choose the engines that give me the most submissions. As you know, verified counts are quite hard to predict, so I just pick whatever I can submit to the most and hope they get verified. Guess I'm lucky to have that 30% verified rate, though...
@Monty: I bet you just select all the platforms and all the engines, right? Like I already mentioned above, or on the previous page: just choose the ones that give you the highest performance. Filter out the "hundreds"; only check what gives the best results. Then, within those best performers, filter out the shit footprints: keep the footprints that return a huge number of search results, like 100k+. That's what's making my LPM higher. It's just certain footprints you need to filter; I don't get the "type" of footprints you mentioned above.
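If you have the result counts for your footprints exported to a file, the filtering itself is easy to script. A hedged Python sketch; the CSV filename, its "footprint,results" layout, and the 100k threshold are my own assumptions:

```python
import csv

MIN_RESULTS = 100_000  # keep only footprints with a huge number of results

# hypothetical input: a "footprint,results" CSV exported from whatever
# tool you use to check search-result counts per footprint
with open("footprint_counts.csv", newline="", encoding="utf-8") as f:
    rows = [(fp, int(count)) for fp, count in csv.reader(f)]

keepers = [fp for fp, count in rows if count >= MIN_RESULTS]

with open("footprints_keep.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(keepers))

print(f"kept {len(keepers)} of {len(rows)} footprints")
```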
I'm running at only 500 threads now, but it still gives me around 90+ LPM, while 1000 threads was killing my CPU and only gave me a stable 105 LPM. But aww, it's just a VPS; should I kill it by raising to 1500 threads? xD
@Alex right on the money! i use almost all the platforms inside Article, Blog Comment, Directory, Forum, Guestbook, Image Comment, Microblog, Social Bookmark, Social Network, and Wiki.
Trying to filter out the shitty engine footprints now. Oh, one more thing: do you use your own list of keywords, or keywords scraped with SB for the campaign's particular niche? Let's say I ran a project on SEO; should I use keywords scraped around that term, or just a keyword list that has nothing to do with it?
@Monty: It's all your choice... If you want to submit to niche-relevant sites, use your own keywords. If not, use some popular freaking keywords like "weight loss fast" or "payday loans". Keyword scraping is also your choice, if your main keywords aren't enough.
LeeG, do you also edit the engine footprints and insert your own, or do you leave them at the defaults?
I edited the footprints on all the engines I use
On gscraper pro, there is a tool built in that shows the number of search results for each footprint
Only in the pro version though
Went through and killed all footprints that have low results
But I have noticed a lot of the engine files that are in ser now have got a lot better
LeeG, are you using private or public proxies for scraping with GScraper? If private, how many and from where?
And what minimum result count do you set for a footprint in GScraper?
Again, thanks for answering my queries.
I don't use gscraper that much.
I get better results letting ser run on its own
I'm version hopping at the moment after seeing a big drop in verified overnight
Something I know Ozz used to hate. So I post comments like that from time to time just to wake him up
My LpM dropped to under 140 with the present release
But, we all get good and bad days
What version of Windows are you running? That looks like the Tardis version
Just had to check the date on my computer in case I had missed a few days here and there
Those stats are for verified and not submissions