1 GB RAM isn't that much. I suppose you have to limit threads to something like 30 in the beginning and see how much CPU it uses. Increase your thread count from there as long as it isn't sitting at 99% CPU all the time.
Lee, can you share the initial thread where you coach us through most of this? My question: under Filters, do we leave all unchecked if we are not keying in on PR and OBLs? So if I want to blast tier 2, for example, leaving these unchecked will yield more... I can't seem to find your initial thread. Thanks!
Up to 16-17 LpM now after time + energy... LOL. Guys, is CB a better way to go than CSX? I have CSX and tested CB but haven't purchased yet. I likely will, just because of Sven's dedication alone...
@m1xf1 - You won't regret it. One thing about @sven is he constantly improves and adds things users want every day. No software developer comes even close to his dedication. Happy rankings bro!
@LeeG now I have 11.43 LpM. It's amazing for me :-) I have ~40% verified of submitted links, so it's very good for me! Thanks for all the tips and tricks!
But I know that I can get better results (50 LpM is my dream - but it's possible, I know! :-P). I've been playing all day with settings (threads, HTML timeout) but with no better results.
Which search engines have you checked? Only Google?
I've had this software since March last year, before this forum was even going. I used it, I just never understood it like I do now from reading the sharing and posts on here. Really good stuff, guys, and thanks again. I've had it since the 1.0 days! It's worked on rankings too; now to expand its POWER!
Yesterday I had ~11.5 LpM, then I changed my search engines to Google only and LpM dropped to 9.50. I restored my search engine selection and LpM is still down (now I have 9.13). Why?
@dudz1ok: and tomorrow you get 11 again, then 10, then 12, then 10 again. Who cares, as long as everything stays within reasonable bounds?
There is just nothing you can do sometimes, as there are good and bad days depending on which engines SER is trying to post to, how many target URLs it finds with the footprints, etc.
Edit: "When I will have 5 LpM I will be VERY happy"
Now you have doubled your goal and you're not happy with that?
One thing often missed is the number of campaigns. If you have 5k verified locations and one campaign, you can get an easy 5k links. If you have 100 campaigns, you'll get 500,000 links, and your LpM will be higher because you're doing more posting and less searching.
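Just to put numbers on that, a quick back-of-the-envelope check (plain Python, nothing SER-specific; the figures are only the ones from the post above):

```python
# Rough scaling maths: every campaign can post to the same pool of
# verified targets, so total links grow with the number of campaigns.
verified_targets = 5_000   # verified locations in the site list
campaigns = 100            # projects / campaigns reusing that list

max_links = verified_targets * campaigns
print(f"{max_links:,} potential links")   # 500,000 potential links
```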
1) "Start with html timeout - I set mine to about 130 > 140" : Why is this so high? My understanding that this is the time it takes for a single site to load. So having this at this level is like saying "please wait for over 2 mins for this site to load. Yes, there may be a bad script or on a slow server or whatever but please keep waiting" - why not drop it to 30s and only post to sites that load within 30s. Even at 30s that is a long time for a website to load. Websites should be loading in under 10s! Would love to get your opinion on this.
2) Regarding Football - the last WC in RSA, watching Germany play England was a pleasure...football at it's best....and I don't watch that much football! ;-)
If Global site lists is checked and its a relatively new install, after the targets are basically used up (submission and Verified) does SER go out and search then off keywords?
@globalgoogler I set my HTML timeout to 180. This is the portion that will make you money...loading the site and submitting the link. I'm willing to wait as long as it takes. I've already spent time harvesting, matching engines, etc so I want to make sure and get every link.
Ok, so I decided I had run GSA long enough and gathered enough samples to cut out the engines that aren't performing well, as suggested in this thread.
I created a spreadsheet where I put the verified & successful stats next to each other, and I roughly selected only the engines with 100+ verified and a 10%+ verified-to-successful ratio. You can find the spreadsheet here: http://www.mediafire.com/view/?tiloahgaciz9df1 - if anyone has suggestions on other engines to remove/add, please let me know.
I did make an exception for Article Directory Pro. This engine had a 20% ratio but only 2 verifieds. So I decided to check this one out, and it turns out the engine file uses terrible search terms: "RSS Feeds" "Add us to favorites" "Make us your home page" "Submit Articles". These terms are way too generic, so I added: "Powered by ArticleDirectoryPro" (text that appears at the bottom of the page). So hopefully it will find more results.
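For anyone who would rather script that cut than eyeball a spreadsheet, here is a minimal sketch of the same filter, assuming the per-engine stats have been exported to a CSV with hypothetical columns engine, successful and verified (an assumed layout for illustration, not SER's own export format):

```python
import csv

# Keep only engines with 100+ verified and a 10%+ verified-to-successful
# ratio, the same thresholds used in the post above.
keep = []
with open("engine_stats.csv", newline="") as f:   # assumed export file
    for row in csv.DictReader(f):
        successful = int(row["successful"])
        verified = int(row["verified"])
        ratio = verified / successful if successful else 0.0
        if verified >= 100 and ratio >= 0.10:
            keep.append(row["engine"])

print("\n".join(sorted(keep)))
```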
@pietpatat . . . I happen to be doing the same thing myself . . . you can check out my results here . . . it's a Google Docs spreadsheet. I actually set up a little MySQL database with the raw data and ran some queries on it. I based it on verified vs identified results. I only ended up eliminating 22 platforms so far; we'll see how that works. I basically eliminated anything with an identified count > 50 and verified links < 5, and a "percent failed" greater than 70%.
**Edit**
Just realized I had a few mistakes in my SQL . . . just updated a new version of the spreadsheet here
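For what it's worth, that cut-off could be expressed as a single query. A rough sqlite3 sketch with an assumed table and column layout (the actual MySQL schema behind the spreadsheet isn't shared, so treat the names as placeholders):

```python
import sqlite3

# Hypothetical table: platform_stats(platform, identified, verified, failed_pct).
conn = sqlite3.connect("ser_stats.db")   # assumed database file
rows = conn.execute(
    """
    SELECT platform
    FROM platform_stats
    WHERE identified > 50
      AND verified < 5
      AND failed_pct > 70
    """
).fetchall()

print("Platforms to consider disabling:")
for (platform,) in rows:
    print(" -", platform)
```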
I've done something similar to you guys but using a slightly larger sample size. I also took the liberty of highlighting low verified % platforms by color to show severity.
Might be good to note: I haven't come close to making it through all the links I've fed SER so far. My list is approaching 3 million now, so my verified percentages should appear pretty low overall.
Comments
Oh that's it, I'm off in a sulk
Try and get on top of the intermittent crashes
What's the spec of the machine you're running it on?
CPU type, memory, operating system, internet connection speed
7 projects for 20 minutes?
So you're running scheduler mode
How many campaigns and tiers in total?
Are you using global site lists, enabled both in the projects and in the advanced settings?
Disable the identified and failed site lists > big "Options" button top right of the screen, then click the Advanced tab
On each project / tier, 2nd tab (the one where you choose which search engines to use), enable "use global site lists" and only tick submitted and verified
10 projects on 30 minute rotation
I run a similar amount to you and I push about 15k an hour submitted
300 threads on a 9 Mbit line is a bit daunting
The best I could do on a 6 Mbit line was 100 threads
You need to open Resource Monitor and watch your CPU and internet connection
Use that to adjust your threads to maximise your connection speed
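If you would rather log those numbers than sit watching Resource Monitor, here is a rough sketch using Python with psutil (a third-party library, nothing to do with SER itself); you would still change the thread count in SER by hand based on what it prints:

```python
import time
import psutil

# Print CPU load and network throughput once a second while you tune
# SER's thread count: back off if CPU sits near 99%, raise it if both
# CPU and bandwidth still have headroom.
last = psutil.net_io_counters()
while True:
    time.sleep(1)
    cpu = psutil.cpu_percent(interval=None)
    now = psutil.net_io_counters()
    up_kbit = (now.bytes_sent - last.bytes_sent) * 8 / 1000
    down_kbit = (now.bytes_recv - last.bytes_recv) * 8 / 1000
    last = now
    print(f"CPU {cpu:5.1f}%  up {up_kbit:8.0f} kbit/s  down {down_kbit:8.0f} kbit/s")
```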
I use both submitted and verified, then again, I'm beyond hope
My latest tweak I'm testing might be a good 'un.
300k submissions a day was so yesterday. I might have found how to squeeze a few more submissions a day out of SER
Your verified-to-submitted ratio looks good when you compare it to my ratio taken over the same period today
Now you just need to take your time playing with the settings to try and boost them
Start with HTML timeout
I set mine to about 130 > 140
Then work on your search engine choice. Some have said they jumped from 20k a day to over 100k a day just by making changes there
If you're using global site lists, also look at which engines you're submitting to that give poor results
I only use GSA Captcha Breaker, so my choice of engines is aimed at what I get results from, using that as my only decaptcha service
I also have a lot of tweaked engines files to find more target sites and allow for my high submission rates
Big options button top right > advanced options > tools > show stats
From there, you can see what you're submitting to and getting verified, and the engines that are giving zero results.
Just killed SER when I checked which buttons to press
Bet all my bloody counters are reset to zero now, ffs
Not quite so bad, looks like Sven has the counters writing to file.
So if you have a bad moment, they are partly kept up to date
That magic calculation that upsets some
time + energy = high LpM + Submissions
No magic buttons used
I posted some test data on here once, with three variations used:
CB + CSX > 150k
CSX + CB > 85k
CB > 200k
I'm still sulking after the earlier crash, I dropped from 245 LpM to 176 LpM
So I decided to do today's updates. 10 new captchas in CB, plus a new engine with the captcha type already added to CB
And you ask if CB is worth buying. :O
Even if the CSX guys give you a free pet puppy to try and get customers back.
They can't compete with those kinds of daily releases
CB is superior. I tried both and there is no comparison.
The one thing that really irks me is that @sven and @ozz keep making improvements - and suddenly CSX has these new changes. LOL.
Do you want the innovator, or the copycat?
A lot of the captchas CSX uses only got added after Sven released the sharing tool for their software
And they would then add shared captchas found on here
I only started getting into seeing what it could do when I got a VPS last year
A tweak here and there and a lot of experimenting
I have owned it since before Santos started the discount thread on BHW
I'm still sulking; before my crash earlier today I was doing 245 LpM and looking at an easy 350k today
Now I'm going to be lucky to scrape past 300k
Spend the time, study what can be tweaked, and daily submissions like this can happen
It's Hunar's fault I'm back to doing straight 24-hour runs and getting results like that again
The guy's a troublemaker
I just wanted to say that I have now moved the cardboard cutout of @LeeG to the corner in my office, and now have @doubleup looking over my shoulder.
Seriously man, that is beyond ridiculous. Any tips are welcome. Like what made the biggest improvements in LPM for you. I would love to hear your input.
@doubleup cool stats
@LeeG cool stats & thank you for sharing your tweaks - you are THE MAN
That's it, dudz1ok, now you have upset Ozz
I know your site is about football; please don't say it also includes the 1966 World Cup
He will eat you alive
Not that I would dare mention that fine year for football either
Time to put my tin foil hat on and hide
Damn, England 0 Germany 1
When running a lot of threads, you would expect some sites to have delays or scripting errors that cause a lag. Another benefit is that most people won't wait 3 minutes for a site to load, which means fewer OBLs across the domain.
Speed is king to my teachings. Why waste time waiting for one site to load when, in the same time period, you could submit to five others
Slow loading sites will also be penalised by Google, so very little link value
Mine is set to 130
HTML timeout is used in two places: submitting and verifying
Most people who suffer low verified counts, it's down to them using the standard setting of 60, from memory
Plenty of time to submit, but not long enough to verify
There are a lot of factors which can affect the speed of verifying
Speed of proxies
Internet connection speed
Private proxies
Public proxies
Etc
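To make the timeout point concrete, here is a minimal sketch in plain Python with requests (not SER's internals, and the URL is just a placeholder): the same slow site either gets its link counted or dropped purely depending on how long you are prepared to wait.

```python
import requests

URL = "http://example.com/some-slow-page"   # placeholder URL

# A short timeout drops slow-but-working sites; a longer one keeps them.
for timeout in (30, 130):
    try:
        r = requests.get(URL, timeout=timeout)
        print(f"timeout={timeout}s -> got HTTP {r.status_code}")
    except requests.exceptions.Timeout:
        print(f"timeout={timeout}s -> gave up; link would show as failed/unverified")
```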