
Proxies, HTML Timeout, Threads - Max Efficiency


Comments

  • 1) If your verified list isn't that big, I would choose the submitted list first. However, I collect all lists in global options but only use the verified or submitted list for my projects. I do that because I like to collect some URLs which may be useful for scripting/modifying some engines later on, so this isn't needed for the average user.

    2) Project options -> Tools -> Export/Import -> Options


  • @LeeG: I might have another tweak for you, as you are using Hotmail. Some sites are blocking Hotmail accounts, so it might be interesting to see what happens if you use AOL Mail or Yahoo.co.uk instead.
    This most probably won't have a huge impact, but you never know until you try it ;)
  • LeeG Eating your first bourne

    I need a few days to recover my sanity at present.

    Plus, as well as SER, I have been pushing my eyes a little too hard with all the testing and monitoring.

    One test I'm doing will take four days to hit full potential, so an easy day here and there is well deserved.

    And I refuse to swap out my email addresses and risk the wrath of Ron, who is waiting for his revenge; I have my own reasons why you don't touch them.

    I won't give him the chance to gloat :D

    I refuse.

    The total and utter shame it would bring upon me.

    Today, in under 15 hrs, I have done over 20k verified and over 170k submitted.

    I might just leave things alone for a bit now rather than try to push it even more.

  • @LeeG

    I see you decided to keep pushing SER until you unlock its full power! : )

    How many sites/URLs are you promoting, though?
  • LeeG Eating your first bourne

    I own a few sites. Not as many as some on here, though.

    For most of those sites, I'm building links to all the inner pages.

  • edited January 2013
    Hey guys,

    Thanks for all the great info on this thread. I've managed to increase my LpM to about 40, but I'm having trouble getting it higher. I was wondering if anyone can shed any light? I've tried to be as detailed as possible with my settings. Thanks in advance!

    1) Running 20 shared private proxies (bought through buyproxies)
    2) Running about 18 projects in total (about 6 projects with 3 tiers each). Using the scheduler to run 10 projects at a time, switching every 30 mins.
    3) I have 188 threads going.
    4) Using global lists (verified only)
    5) I've tweaked my engines to only submit to the ones which have a > 10% Identified/Verified rate, as specified by @Ron
    6) I query search engines every 10 seconds
    7) Projects generally use between 138 search engines (English only) and 1000 search engines (all) for the lower tiers
    8) Keywords are about 2,000 per project (in some cases 100k for lower tiers).
    9) I only verify my links once every 1440 mins


    Cheers
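A quick back-of-envelope check on those settings, as a minimal sketch. The assumption (not stated anywhere in the thread) is that SER fires one search query per delay interval and rotates proxies and engines roughly round-robin:

```python
# Back-of-envelope load check for the settings above.
# ASSUMPTION: one query per delay interval, round-robin proxy/engine rotation.

proxies = 20        # private proxies in the pool
query_delay = 10    # seconds between search-engine queries
engines = 138       # search engines selected (English-only case)

queries_per_hour = 3600 / query_delay              # 360 total
per_proxy_per_hour = queries_per_hour / proxies    # 18 per IP

# A given (engine, IP) pair only recurs after cycling through all
# engine/proxy combinations:
same_engine_gap_min = engines * proxies * query_delay / 60

print(f"{per_proxy_per_hour:.0f} queries/hour per proxy")
print(f"~{same_engine_gap_min:.0f} min between hits on one engine from one IP")
```

On that rough model the proxies aren't being hammered at all, which fits the replies below pointing at search-engine selection rather than proxy count as the lever.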

  • LeeG Eating your first bourne

    I use five Googles per project. In my opinion, all you do by using more than you need is pull more repeat results.

    A lot of the engines are just Google or Bing spin-offs.

  • edited January 2013
    Cool, thanks. I'm worried I'll get banned by Google if I do this. I saw your post above about using international Googles, but all my proxies are US proxies, so even if I go international I'm sure I'll get banned....

    At a 5-second delay between search engine queries, my proxies got banned after running for only 3 days, so I bought another set of proxies today.
  • LeeG Eating your first bourne

    It's only google.com that redirects to the Google of the proxy's country.

    A lot of the Google bans people get happen because, on shared proxies, lots of people hit the same versions of Google.

    Five people all on the same shared proxies, all hitting Google US or UK every five seconds, will get those IPs slapped.
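The shared-proxy arithmetic LeeG describes is easy to put numbers on. A minimal sketch; the pool size and per-IP tolerance here are invented for illustration, not figures from Google or from this thread:

```python
# Rough model of shared-proxy bans: several customers hitting the same
# Google version through the same small pool of IPs.

users_sharing = 5       # customers on the same shared proxies
delay_per_user = 5      # seconds each user waits between queries
proxies_in_pool = 10    # ASSUMPTION: IPs the combined load spreads over

queries_per_ip_per_min = users_sharing * (60 / delay_per_user) / proxies_in_pool
print(f"{queries_per_ip_per_min:.0f} queries/min hitting Google from each IP")

# ASSUMPTION: a purely illustrative tolerance; the real threshold is unknown.
if queries_per_ip_per_min > 2:
    print("well past what a single 'user' looks like -> captchas, then bans")
```

Six queries a minute from one IP, around the clock, looks nothing like a human searcher, and no single customer ever sees the full rate their shared IPs are producing.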

  • ron SERLists.com
    edited January 2013

    One day @LeeG will discover that he could have been doing 500k submitted per day if he had only changed his emails like @Ron suggested. This will be my revenge.

    @sonic81 - I think your list looks pretty good. I think @LeeG is likely onto something with the smaller number of search engines. I have experimented, but not enough in this area. I'm going to get bolder and maybe use fewer than 10.

    When I want to see if my changes are making a difference, I try to change only the 'no limits' lower tiers. They make 90+% of the links, so it tells me very quickly if I am onto something.

  • spunko2010 Isle of Man
    edited January 2013
    For a newbie like me, this thread is really useful, thanks. I am considering using a VPS in the next week, as my CPU is maxing out at 95% and it's holding me back. LeeG, did you set it up yourself? Was it easy? I normally use Linux VPSes, so it'll be a bit of a learning curve to install SER on a Windows VPS :S

    Also, what kind of sites do you guys own where your SEO requires you to be doing 100k+ per day? Unless I'm naive, surely not many keywords can be that competitive? Is this a long-term thing for most of you?
  • LeeG Eating your first bourne

    Changing emails is about as much use as a chocolate teapot, in my opinion.

    When emails are blacklisted, so are the IPs used with them.

    spunko2010, it's very easy; there are videos on YouTube showing the basics of running a Windows VPS.

    And just to prove the 300,000 is possible:

    [screenshot of submission stats]

  • LeeG, I'm so jealous of those article URLs o.O

    http://prntscr.com/qoer8

    Your suggested settings were good, as you can see in my screenshot.

    My number of submissions increased massively :)


  • So I just experimented a bit with using only 8 Google SEs like you, Lee, but it didn't work out that well. Probably because I only use 5 private proxies compared to your 30. GSA reduced the number of threads to ~5 (I have it set at 50) every time it went into search mode. This is probably because it waits 10 sec between each search, and with only 5 proxies and 8 SEs it must wait quite some time between searches; at least that's my guess (see the sketch after this comment). I also had this problem when I had more SEs selected, but then GSA would only reduce the thread count to around 20, not as low as 5.

    Anyway, I went a totally different route: I selected 170 SEs (8 random Googles + all English SEs) and only use proxies for posting (I figured that with 170 SEs selected there would be quite some time between searches on any one SE, so my IP hopefully wouldn't get banned), and it's working pretty well. LpM doubled and it always uses 50 threads in search mode.

    The next thing I want to tweak is deselecting the bad-performing platforms. The thing is, I deleted my site list 2 weeks ago and it only has 3k backlinks now (I only use Bookmarks, Articles, Social Networks & Web 2.0). Do you think a list of 3k is big enough to make a decision on what the bad-performing platforms are, or should I let it gather some more samples? I guess I can already deselect all the Web 2.0s except Xfire, as this is the only Web 2.0 platform I got backlinks from.
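A sketch of the thread collapse described above; this is my reading of the observed behaviour, not SER's documented internals. If each proxy can run only one search at a time with a cooldown between queries, the proxy count becomes a hard ceiling on concurrent search threads:

```python
def search_thread_ceiling(threads_configured: int, proxies: int) -> int:
    """Concurrent searches can't exceed the number of usable proxies
    if each proxy handles one query at a time with a cooldown."""
    return min(threads_configured, proxies)

print(search_thread_ceiling(50, 5))    # -> 5, matching the drop to ~5 threads
print(search_thread_ceiling(50, 30))   # -> 30, why a 30-proxy setup copes
```

It would also explain why disabling proxies for searching restores the full 50 threads: the per-proxy cooldown, and the ceiling with it, disappears.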
  • ron SERLists.com

    That's a rather small list, but since you confined it to a small group of platforms, I think you will have enough to weed out the worst of them. Your estimation of how to handle Web 2.0 is pretty spot-on.

    Just remember that many Web 2.0s can still be used so long as you have a human captcha service. So it would be worthwhile to separate out a Web 2.0 project to be solved by DBC or whatever. Of course, this would only be for T1 links to the money sites. And then you can make decisions with that new information.

  • Hi Lee,

    Which proxy provider are you using? How much for 30 proxies?
  • AlexR Cape Town
    @LeeG
    1) How many threads do you think you can run per set of 10 fast private proxies, with only 5 Google engines selected, to ensure your proxies stay active?

    What about using "identified" for the lower tiers so that SER can sift through them and double-check whether they really are bad sites? By using identified, you save all the parsing and slowly migrate them up the list. Then every few weeks just clean out identified, because it's obviously only left with the ones that can't get submitted to.

    @all - don't you think this would be a great feature:

    Tools:
    1) View platform posting success %
    2) Export platform posting success %
    3) Disable all platforms with success below x% (like in CB, where you can set the solve rate to be x%)
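Until something like point 3 exists, the same check can be scripted offline. A minimal sketch, assuming you've tallied per-platform stats into a CSV named platform_stats.csv with columns platform, submitted, verified; the file name and layout are hypothetical, not an existing SER export:

```python
import csv

# 10% verified/submitted, per the rule of thumb quoted in this thread
THRESHOLD = 0.10

with open("platform_stats.csv", newline="") as f:
    for row in csv.DictReader(f):
        submitted = int(row["submitted"])
        verified = int(row["verified"])
        rate = verified / submitted if submitted else 0.0
        status = "DISABLE" if rate < THRESHOLD else "keep"
        print(f"{row['platform']:<30} {rate:6.1%}  {status}")
```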
  • Lee, which articles are you posting to? All, or with some exceptions?
  • edited January 2013
    >>> 3) Disable all platforms with success below x% (like in CB, where you can set the solve rate to be x%)

    +1 @GlobalGoogler, that was the exact feature on my mind yesterday after reading LeeG's recommendations.
  • @GlobalGoogler, that would be a nice feature indeed, and it would save me the time of checking platforms across 7 different servers and optimizing each one. It's a pain in the ass to do this, lol.
  • Ozz
    edited January 2013
    Nice feature in theory, but you should be aware that you need a large sample size for that. I also see problems occurring after you've re-verified your links: for example, all guestbooks are gone because the page has moved (= low verification rate).

    I'm not sure whether this auto-disable feature would do more harm than good. Instead, I think it would be more useful if it were possible to merge all stats into one file with some percentages, to get a quick overview. That would make it easier for us to decide, once we feel comfortable with the size of our database and know what we are doing.

    EDIT: Forget my first paragraph, as re-verifying won't touch the site list. But I stand by my point that an auto-disable feature would do more harm than good for average users.
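Ozz's sample-size warning can be made concrete with a quick confidence interval. A minimal sketch using the normal approximation; the counts below are invented for illustration:

```python
import math

def rate_interval(successes: int, trials: int, z: float = 1.96):
    """95% confidence interval for an observed success rate."""
    p = successes / trials
    margin = z * math.sqrt(p * (1 - p) / trials)
    return max(0.0, p - margin), min(1.0, p + margin)

# 3 verified out of 20 submitted: the true rate could be ~0% to ~31%,
# far too wide to justify auto-disabling the platform.
print(rate_interval(3, 20))
# 150 verified out of 1000: now the estimate is tight enough to act on.
print(rate_interval(150, 1000))
```

With a 3k-link list spread over dozens of platforms, many platforms will sit in the first regime, which is exactly why a blind auto-disable threshold can kill platforms that are actually fine.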
  • LeeG Eating your first bourne

    Now that I know what can be done, through my own submissions and a lot of experimenting, I'm working on another idea, which will probably drop the submission rate but improve verified.

    Any kind of automated monitoring of the engines used can be affected by too many variables to make it worthwhile implementing.

    You might hit 1,000 sites that are moderated.

    You might have a bad run on captchas being solved.

    The more results you have to work with, the better your judgement call on what to submit to and what not.

  • Well, the 6 servers I had to check have each been running 24/7 for at least 3-4 months, so I had a good amount of data to work with. I just got a 7th today, and I'll let that one run for at least 2 months before I go in and decide which engines to disable, etc.

    But one thing I did notice when optimizing the platforms across the servers is that it was mainly the same platforms performing very, very badly on all of them.

    I've thought about either learning how to script the engines and maybe improving them, or just hiring that out. I hardly have the time to do it myself, so the 2nd option is more likely for me. I know others are coming out with services like this, but I'm impatient, so if I can do something about it I might as well.
  • @hunar, 7 servers, very impressive :)

    I'm hitting about 40k submissions consistently daily now. Small change compared to some guys here, but over a 200% increase for me, so I'm happy with last week's tweaks.

    What tier 2 : tier 1 and tier 3 : tier 1 ratios do you guys all use? How do you control this ratio? Do you limit the number of links you build to tier 1 daily? Thanks!
  • LeeG Eating your first bourne

    All it takes is a little bit of work and time to boost submissions.

    On a private forum I belong to, I have helped guys get similar increases.

    Once you start looking more into it, you can find other areas to tweak, like adding extra search terms to increase the number of search results you get each time.

  • LeeG, what does LpM mean?

    Because mine is too low o.O

    http://prntscr.com/qsu8r

    HELP!
  • Links per minute
  • rodol: OK... so my links per minute is 0.12.

    Damn, I'm such a noob :(
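For anyone else wondering, the LpM figure is just arithmetic: submitted links divided by minutes of runtime. A tiny sketch checking the numbers quoted earlier in the thread:

```python
def lpm(submitted_links: int, hours_running: float) -> float:
    """Links per minute = links divided by minutes of runtime."""
    return submitted_links / (hours_running * 60)

print(f"{lpm(170_000, 15):.0f} LpM")  # LeeG's 170k submitted in ~15 h -> ~189
print(f"{lpm(40_000, 24):.0f} LpM")   # 40k per day -> ~28
```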
  • Do you guys verify? I found that GSA wastes a lot of time when verifying. How do you set up "when to verify"?
  • On a tiered link pyramid, the verified URLs are used as the targets for the tiers below.