
Proxies, HTML Timeout, Threads - Max Efficiency


Comments

  • LeeG Eating your first bourne

    Notice Sven is slacking. The verified scale only goes to 27777 :D

    Give Sven something to do over the weekend. I found something else to break :D

    I run 27 projects, all with three tiers.

    Even though the numbers are big, the links are spread out over a lot of tiers and projects

    This might be of interest to some.

    A guy on bhw asked about swapping out articles, repeating content etc., and as luck would have it, there is a good video on the subject.

    http://www.seomoz.org/blog/how-unique-does-content-need-to-be-to-perform-well-in-search-engines-whiteboard-friday

     

    For my own lower tier content, I use AutoContentWriter.

    It's cheap and does what it says on the tin. Idiot proof.

     

    As for the difference in the numbers in the graphs above: the first I did while SER was running.

    Then, when I was shutting down ready for the nightly VPS reboot, I did the second one.

    And I then killed my first hour's running totals when SER needed a helping hand to stop; it hung on one thread.

  • You don't blast tier 1 and tier 2 with secondary links like bookmarks? Or do you just use everything on tier 3?
  • LeeG Eating your first bourne

    To try and utilise the sitelist feature to its max, I clone the engines used across all tiers.

    Sven has now made this easier with a dedicated feature for it.

    My choice in engines is all down to which ones I can get links from

    Why waste time and resources posting links to engines that either produce poor results or that CB can't be used on?

    That way, if you're using the sitelists and can't find results, are blocked by Google etc., you always have a topped-up supply of places to post links to.

  • Come on Lee, you have only two choices here. You can either start a paid training service and let us get the knowledge you have, OR stop posting your screenshots / talking about your submissions and stop teasing us!!


  • ron SERLists.com

    @LeeG - I did an analysis on all the engines, and ended up dumping 124 engines. The copy engine feature is a huge deal. I hope people get it...

    @everyone - If you haven't done it, grab the data on your identified vs. verified, dump it in a spreadsheet, and get to work. You will clearly see where SER is wasting a lot of time, whether it's web 2.0's where the signup page code has changed, engines that use recaptcha, or engines that just don't have that many web properties, etc.
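
    A minimal sketch of the identified-vs-verified check described above, for anyone who prefers a few lines of Python to a spreadsheet. The engine names and counts below are made-up examples; you would paste in the figures SER shows for your own projects. The 10% cut-off matches the threshold ron mentions further down the thread.

```python
# Hypothetical identified/verified counts per engine -- replace with the
# numbers exported from SER's own stats.
stats = {
    "Article Engine A": (12000, 2400),
    "Web 2.0 Engine B": (20000, 15),
    "Forum Engine C":   (8000, 950),
}

THRESHOLD = 0.10  # drop anything at 10% or less, per ron's rule of thumb

for engine, (identified, verified) in sorted(stats.items()):
    ratio = verified / identified if identified else 0.0
    verdict = "keep" if ratio > THRESHOLD else "consider disabling"
    print(f"{engine:<20} {verified:>6}/{identified:<6} = {ratio:6.1%} -> {verdict}")
```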

     

  • I personally love what Lee is doing. He is pushing me to learn the software more, because now I see what it can do, which in turn will keep pushing me more and more. :)
  • WOW... I never thought about this, how stupid I am. :(

    If we post a feature request to Sven to auto-detect engines that use recaptcha and auto-disable those engines, I think this would really help to increase submissions and verified links, since I don't plan on using recaptcha either.

    Do you guys think it is a good idea?
  • Wow, some amazing info in this thread! Already implemented some of the things Lee suggested and can't wait to see if it will change my results. Planning on deselecting some bad-performing engines in the future, as I was dumb enough to delete my sitelist last week, so I will need my sitelist to grow a bit bigger before I can tell the bad performers for sure.
  • What LPM are people getting when NOT using global lists?
  • "If we post a feature request to sven to add auto detect engine that use recaptcha and auto disable those engine. I think this is really help to increase the submission and verify link. Since i don't planning to use recaptcha also."

    very bad idea imo as you will end up with just a couple of engines. 
    reCaptcha can be installed to basically every platform script. with a feature like this you will disable every engine once SER finds a site that has not installed the default captcha and reCaptcha instead.
  • LeeG Eating your first bourne

    What I did was look at my stats to see which engines produce links

    Then based my choice around that.

    You will see some have low numbers like 20 or 30 verified

    Under the advanced options, there is a stats monitor to see your submitted and verified rates

    And if you have the right setup with SER, a fast CPU and connection, 300k+ should be easy.

    That's my own opinion after today's 12hr mark, which can go wrong when I reboot my VPS and lose the day's totals.

    [screenshot: submitted/verified stats]

  • AlexR Cape Town
    @LeeG -

    1) If you ran a project with PR1+ PAGE (not domain) blogs, 50 OBL, how many links would you likely generate per 24 hours?
    2) If you ran a project with PR3+ PAGE (not domain) blogs, how many links would you likely generate per 24 hours?
    3) If you ran a project with PR3+ DOMAIN articles, how many links would you likely generate per 24 hours?

    I haven't set up all my lower tiers yet, so kinda tricky for me to compare to the numbers you're generating. Would like to get your rough estimate for the above...
  • LeeG Eating your first bourne

    Try it and see

    Then share your results

  • AlexR Cape Town
    3.5 LPM for 1 & 2. Haven't done 3 yet.
  • This is an awesome thread guys. Top stuff, kudos to @leeg, @ron and @globagoogler. I've gotten my LPM to 40 links per minute, still tiny compared to you guys but a big increase for me!

    Going nuts on my tier 2 and tier 3, trying to juice up my tier 1s.
  • LeeG Eating your first bourne

    I think you can say 1/4 million submissions can be easily achieved if you take the time to learn how to use the software.

    [screenshot: 24hr submission stats]

    288,106 submissions in 24hr

    Remember that total is 10k short of what it should have been, due to the problems shutting SER down last night.

    With the 10k I lost off the counters, it would be 298,106.

    Verified is crap in all honesty

    Something that I need to work on

    Whatever Sven has tweaked with the memory handling for 64-bit operating systems, he has done it well.

    That's the first 23hr straight run I have done in a long time.

  • LeeG Eating your first bourne

    And the stats diagram to back up the above: [image]
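
    For anyone converting those daily totals into the LPM figure quoted elsewhere in the thread, a rough back-of-the-envelope sketch (it assumes LPM means submitted links per minute; the 288,106 and ~10k figures are the ones given above):

```python
# Convert the 24hr submission totals above into an approximate LPM figure.
submissions_counted = 288_106   # from the screenshot above
submissions_lost = 10_000       # roughly what the shutdown problem cost
minutes_per_day = 24 * 60

lpm = (submissions_counted + submissions_lost) / minutes_per_day
print(f"~{lpm:.0f} LPM")        # prints "~207 LPM"
```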

  • Pretty sick to see that diagram go way up. Were the first few months (with the relatively low numbers) on the same VPS with the same number of proxies, and the numbers really started increasing when you started tweaking, right? Have you checked lately how big your sitelist is? Guess that must also be getting enormous.
  • LeeG Eating your first bourne

    For the first few months I was running it on a home PC with a limited internet connection.

    Even then, for a six meg connection, I could get reasonable results considering the limitations

    It's only in the last couple of months that I started to use a VPS.

    That's when I started playing with SER to see what it could do and unlock its potential.

    The sitelist I have killed a few times.

    Then, using Options > Advanced > Tools > Add urls from projects,

    you can repopulate the sitelists.

    What's the point in keeping the identified list?

    Submitted and verified are the ones you're looking to submit to.

    Ideally just hitting verified.

    In the coming months, that will be my next move to get a higher verified rate

    But it also buys you a break from hitting the search engines too hard and getting your IP banned for too many search requests.

  • Ok,

    I just went through and compared my lists from Options -> Advanced -> Tools and took a good look at which platforms are not performing at all. I gotta say I'm quite astonished at just how many platforms are dead or not working, more specifically the web 2.0s. I think out of all of them only 5-6 work? I understand a lot of that goes with captchas etc., but still. Even in the forum and guestbook categories there are a ton that are just not working that well, some that have well over 20k submissions but only like 5-20 verified.

    I can't imagine all that can be due to just captchas? Are the platforms goofy, does the programming for these need to be updated, or what?
  • ron SERLists.com

    They're changing the signup pages to combat automated signups. And recaptcha is in there too. I only found 2 web2.0's worthy enough to keep.

    I didn't just look at the amount of links verified, I also created a % column where I divided verified/identified. I automatically got rid of anything 10% or less because even if you got 1000 links, you had to cycle through 10,000 targets to get that 1000. Now that's inefficient.

  • I use web 2.0 as Tier 1 and I enable DBC on those. That's where quantity is less important than quality
  • Well, the majority of the inefficient platforms are the web 2.0s, and that's understandable with all the changes that constantly need to be made to those, but in the forums, for example, there were quite a few non-performers as well, and quite a few others besides. I think I might have to read that script manual and the other things that Ozz had posted; maybe I'll try to update some.
  • Well, Lee, I think I'm starting to catch up on ya. I'm pushing over 125 LPM now and it's pretty consistent until my proxies burn out. The only thing I can think of that you might not be doing that I am is maybe you're not doing any PR checks and just blasting away?
  • ron SERLists.com
    edited January 2013

    @hyde - I should have clarified that I cut all those web2.0's on the bottom tiers. Using DBC is the way to go on projects to the moneysite.

    @hunar - I'm not there yet, but over 50 LPM consistently. I have had SER create links at over 4000-6000 per hour for several hours at a clip, but not an entire day. I'm still tweaking a lot of stuff though.   

  • @ron @hunar @leeg can I ask where the docs are in SER for the copy engine feature? I'm having trouble finding it and working out how to reduce the engines which don't perform.

    Thanks.
  • LeeG Eating your first bourne

    The trick I use to stop proxies burning out is to use different Google engines as much as possible.

    Some know that I only use one type of search engine, which is Google.

    Each project and tier then uses 4 different Googles from the other projects and tiers, plus the international Google, which sends you to the Google page of your proxy's country.

    So if you had an Italian proxy, Google International sends you to Google Italy.

    A Spanish proxy, and it's google.es.

    What this does is stop the constant blasting of searches at one particular search engine.

    End result: few if any Google bans on a proxy (there is a rough sketch of the numbers after this post).

     

    @Sonic, if you open up a project to edit it, on the left of the screen you have the tick box area where you select the engines, i.e. blogs, articles, forums etc. Right click there and the option to copy the engines will show up.
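
    To illustrate the load-spreading idea from the post above, here is a toy sketch (this is not a SER feature, just arithmetic; the project count matches the 27 projects mentioned earlier, while the query rate and the engine pool are made up):

```python
# Toy model: queries per hour hitting the busiest Google endpoint when every
# project uses one engine vs. when each project rotates through several
# country-specific Googles plus "Google International".
from collections import Counter
from itertools import cycle

PROJECTS = 27                      # as mentioned earlier in the thread
QUERIES_PER_PROJECT_PER_HOUR = 60  # hypothetical scraping rate

# Scenario A: every project queries the same engine.
single = Counter({"google.com": PROJECTS * QUERIES_PER_PROJECT_PER_HOUR})

# Scenario B: each project gets 4 country Googles drawn from a pool, plus the
# international Google (which follows the proxy's country).
pool = cycle(["google.com", "google.co.uk", "google.de", "google.it",
              "google.es", "google.fr", "google.nl", "google.ca",
              "google.com.au", "google.pl"])
spread = Counter()
for _ in range(PROJECTS):
    engines = [next(pool) for _ in range(4)] + ["google (international)"]
    for engine in engines:
        spread[engine] += QUERIES_PER_PROJECT_PER_HOUR / len(engines)

print("Busiest endpoint, single engine:", single.most_common(1))
print("Busiest endpoint, spread:       ", spread.most_common(1))
```

    With everything pointed at one engine, the busiest endpoint sees all 1,620 queries an hour in this toy example; spread out, the worst case drops to a few hundred, which is the whole point of mixing the Googles.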

  • @leeg thanks mate, very useful. The copy feature makes setting up new projects easier; I was doing it all manually! grrr....
  • LeeG Eating your first bourne

    Tell me about it.

    I did it over 80 times by hand. Over four hours that day making alterations.

    Then someone asked for the feature to be added

  • edited January 2013
    Hi guys,

    Thanks again for all the great info on this thread. Can I ask two questions?

    1) When you guys use global lists, are you only using the verified links list? I ignore the submitted, identified and failed lists (I'm trying to increase my verified links list over time and use GSA resources efficiently: post to the global lists, and once that's done go back to the search engines and try to mine more links). Don't want to waste resources on posting links that don't work.

    2) Is there a way to copy the settings for each project in the options tab as well?