Come on Lee, you have only 2 choices here. You can either start a paid training service and let us get the knowledge you have, OR stop posting your screenshots / talking about your submissions and stop teasing us!!
@LeeG - I did an analysis on all the engines, and ended up dumping 124 engines. The copy engine feature is a huge deal. I hope people get it...
@everyone - If you haven't done it, grab the data on your identified vs. verified, dump it in a spreadsheet, and get to work. You will clearly see where SER is wasting a lot of time, whether it's web 2.0's where the signup page code has changed, engines that use recaptcha, or engines that just don't have that many web properties, etc.
I personally love what Lee is doing. He is pushing me to learn the software more because now I see what it can do, which in turn will keep pushing me more and more.
What if we post a feature request to Sven to auto-detect engines that use recaptcha and auto-disable those engines? I think this would really help to increase submitted and verified links, since I don't plan on using recaptcha anyway.
Wow, some amazing info in this thread! Already implemented some of the things Lee suggested and can't wait to see if it will change my results. Planning on deselecting some badly performing engines in the future, as I was dumb enough to delete my sitelist last week, so I will need my sitelist to grow a bit bigger before I can tell the bad performers for sure.
"If we post a feature request to sven to add auto detect engine that use recaptcha and auto disable those engine. I think this is really help to increase the submission and verify link. Since i don't planning to use recaptcha also."
Very bad idea imo, as you will end up with just a couple of engines.
reCaptcha can be installed on basically every platform script. With a feature like this you will disable a whole engine as soon as SER finds one site that has swapped the default captcha for reCaptcha.
1) If you ran a project with PR1+ PAGE (not domain) blogs, 50 OBL, how many links would you likely generate per 24 hours?
2) If you ran a project with PR3+ PAGE (not domain) blogs, how many links would you likely generate per 24 hours?
3) If you ran a project with PR3+ DOMAIN articles, how many links would you likely generate per 24 hours?
I haven't set up all my lower tiers yet, so kinda tricky for me to compare to the numbers you're generating. Would like to get your rough estimate for the above...
This is an awesome thread guys, top stuff. Kudos to @leeg, @ron and @globagoogler. I've gotten my LpM to 40 links per minute, still tiny compared to you guys but a big increase for me!
Going nuts on my tier 2 and tier 3, trying to juice up my tier 1s.
Pretty sick to see that diagram go way up. Are the first few months (with the relatively low numbers) on the same VPS with the same number of proxies, and the numbers really started increasing when you started tweaking, right? Have you checked lately how big your sitelist is? Guess that must also be getting enormous.
I just went through and compared my lists from Options -> Advanced -> Tools and took a good look at which platforms are not performing at all. I gotta say I'm quite astonished at just how many platforms are dead or not working, more specifically the web 2.0's. I think out of all of them only 5-6 work? I understand a lot of that goes with captchas etc., but still, even in the forum and guestbook categories there are a ton that are just not working that well, some that have well over 20k submissions but only like 5-20 verified.
I can't imagine all that can be due to just captchas? Are the platforms goofy, does the programming for these need to be updated, or what?
They're changing the signup pages to combat automated signups. And recaptcha is in there too. I only found 2 web2.0's worthy enough to keep.
I didn't just look at the amount of links verified, I also created a % column where I divided verified/identified. I automatically got rid of anything 10% or less because even if you got 1000 links, you had to cycle through 10,000 targets to get that 1000. Now that's inefficient.
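For anyone who would rather script that check than build the % column in a spreadsheet, here is a rough sketch of the same verified/identified maths. It assumes you have already exported the per-engine counts to a CSV called engine_stats.csv with engine, identified and verified columns; the file name and layout are just placeholders for this example, not something SER produces on its own.

```python
# Rough sketch of the verified/identified check described above.
# Assumes per-engine counts have been exported to engine_stats.csv with
# columns: engine, identified, verified (file name/layout are placeholders).
import csv

THRESHOLD = 0.10  # anything converting at 10% or less gets flagged

with open("engine_stats.csv", newline="") as f:
    for row in csv.DictReader(f):
        identified = int(row["identified"])
        verified = int(row["verified"])
        ratio = verified / identified if identified else 0.0
        if ratio <= THRESHOLD:
            # e.g. 1000 verified out of 10,000 identified = 10% -> inefficient
            print(f"{row['engine']}: {verified}/{identified} = {ratio:.1%} -> consider disabling")
```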
Well, the majority of the inefficient platforms are the web 2.0's, and that's understandable with all the changes that constantly need to be made to those, but in the forum category for example there were quite a few as well, and quite a few others too. I think I might have to read that script manual and the other things that Ozz had posted; maybe I'll try to update some.
Well, Lee, I think I'm starting to catch up on ya. I'm pushing over 125 LPM now and it's pretty consistent until my proxies burn out. The only thing I can think of that you might not be doing that I am is maybe you're not doing any PR checks and just blasting away?
@hyde - I should have clarified that I cut all those web2.0's on the bottom tiers. Using DBC is the way to go on projects to the moneysite.
@hunar - I'm not there yet, but over 50 LPM consistently. I have had SER create links at over 4000-6000 per hour for several hours at a clip, but not an entire day. I'm still tweaking a lot of stuff though.
@ron @hunar @leeg can I ask where the docs are in SER for the copy engine feature? I'm having trouble finding it and working out how to reduce engines which don't perform.
The trick I use to save proxies from burning out is to use different Google engines as much as possible.
Some know that I use one type of search engine only, which is Google.
Each project and tier then uses 4 different Googles from the other projects and tiers, plus the international Google, which sends you to the Google page of your proxy's country.
So if you had an Italian proxy, Google International sends you to Google Italy.
A Spanish proxy, and it's Google ES.
What this does is stop the constant blasting of searches at one particular search engine.
End result, few if any Google bans on a proxy.
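Just to make the rotation idea concrete, here is a throwaway sketch of the round-robin logic, not something SER runs itself. The engine names are placeholders for whichever country Googles SER actually lists, and the project names are made up.

```python
# Throwaway sketch of spreading country-specific Googles across projects/tiers
# so no single Google takes all the search queries. Engine and project names
# are placeholders, not SER's actual lists.
from itertools import cycle, islice

COUNTRY_GOOGLES = ["Google UK", "Google IT", "Google ES", "Google DE",
                   "Google FR", "Google NL", "Google SE", "Google PL"]

def assign_engines(projects, per_project=4):
    """Give each project/tier Google International plus a different slice of
    country Googles, so the search load is spread out."""
    pool = cycle(COUNTRY_GOOGLES)
    return {name: ["Google International"] + list(islice(pool, per_project))
            for name in projects}

if __name__ == "__main__":
    tiers = [f"project-{p} tier-{t}" for p in (1, 2) for t in (1, 2, 3)]
    for name, engines in assign_engines(tiers).items():
        print(name, "->", ", ".join(engines))
```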
@Sonic, if you open up a project to edit it, on the left of the screen you have the tick box area where you select the engines, i.e. blogs, articles, forums etc. Right click there and the option to copy the engines will show up.
Thanks again for all the great info on this thread. Can I ask two questions?
1) When you guys use global lists, are you only using the verified links list? I ignore the submitted, identified and failed lists (I'm trying to increase my verified links list over time and use GSA resources efficiently: post to global lists first, and once that's done go back to the search engines and try to mine more links). Don't want to waste resources on posting links that don't work.
2) Is there a way to copy the settings for each project in the options tab as well?
Notice Sven is slacking. The verified scale only goes to 27777
Give Sven something to do over the weekend. I found something else to break
I run 27 projects, all with three tiers.
Even though the numbers are big, the links are spread out over a lot of tiers and projects
This might be of interest to some.
A guy on bhw asked about swapping out articles, repeating content etc., and as luck would have it, there is a good video on the subject.
http://www.seomoz.org/blog/how-unique-does-content-need-to-be-to-perform-well-in-search-engines-whiteboard-friday
For my own lower tier content, I use AutoContentWriter.
It's cheap and does what it says on the tin. Idiot proof.
As for the difference in the numbers in the graphs above, the first I did while SER was running.
Then when I was shutting down ready for the nightly VPS reboot, I did the second one.
And I then killed my first hour's running totals when SER needed a helping hand to stop; it hung on one thread.
To try and utilise the sitelist feature to its max, I clone the engines used across all tiers.
Sven has now made this easier to do with a feature for it.
My choice in engines is all down to which ones I can get links from
Why waste time and resources posting links to engines that either produce poor results or that CB can't be used on.
That way if you're using the sitelists, can't find results, are blocked by Google etc, you always have a topped up supply of places to post links to.
Do you guys think that recaptcha auto-disable feature request would be a good idea?
What I did was look at my stats to see which engines produce links
Then base my choice around that
You will see some have low numbers like 20 or 30 verified
Under the advanced options, there is a stats monitor to see your submitted and verified rates
And if you have the right set up with SER, fast CPU and connection, 300k+ should be easy.
That's my own opinion after today's 12hr mark, which can go wrong when I reboot my VPS and lose the day's totals.
Try it and see
Then share your results
I think you can say 1/4 million submissions can be easily achieved if you take the time to learn how to use the software.
288,106 submissions in 24hr
Remember that total is 10k short of what it should have been, from when I had problems shutting SER down last night.
With the 10k I lost off the counters, 298,106
Verified is crap in all honesty
Something that I need to work on
Whatever Sven has tweaked on the memory handling for 64bit operating systems, he has done it well.
That's the first 23hr straight run I have done in a long time.
And the stats diagram to back up the above.
First few months I was running it on a home pc with a limited internet connection
Even then, for a six meg connection, I could get reasonable results considering the limitations
It's only in the last couple of months that I started to use a VPS.
That's when I started playing with SER to see what it could do and unlock its potential.
Sitelist I have killed a few times
Then using Options > Advanced > Tools > Add urls from projects
You can re populate the sitelists
What's the point in keeping the identified list?
Submitted and verified are the ones you're looking to submit to.
Ideally just hitting verified.
In the coming months, that will be my next move to get a higher verified rate
But it also buys a break from hitting the search engines too hard and getting your IP banned for too many search requests.
Tell me about it. Before the copy engines feature, I did it over 80 times by hand, over four hours that day making alterations.
Then someone asked for the feature to be added