
Proxies, HTML Timeout, Threads - Max Efficiency


Comments

  • Amazing thread! I've learnt a ton just now :D Thank you, everyone. I have a newbie question.

    I don't care much about LpM right now because I'm not looking to blast everything at the moment; I'm in dripping mode :D What I do care about is my verified/submitted ratio. At the moment I have 20 private proxies and I mix them with public ones in GSA from 3-5 good sources (thanks @Ozz for the tutorial).

    So my question is: would it help improve my verified/submitted ratio if I used only private, or private + semi-private proxies?
  • Ron, when you set custom verification, do you check "Don't remove URL", "Remove after 1st verification try" and "Also re-verify existing backlinks"?
  • I have 13 LpM constantly with a single project:
    over 100k submitted
    over 17k verified

    Damn, my rankings still aren't increasing at Google.

    I'm using a max of 50 outbound links, plus bad-keyword and bad-site filtering, so these are pretty good quality backlinks.
  • ron SERLists.com

    @system0102, no difference between private and semi-private. I still average about 15% verified to submitted, but that is measured across all links, including the pure spam. I'm sure the % is higher on my T1s - I just never measured that on its own.

    @zuluranger, I don't check any of those boxes ever for any tier.
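
    (Aside for newer readers: the verified/submitted ratio being discussed is just verified links divided by submitted links. A quick sanity check in Python, using the round figures quoted a few posts up as assumed inputs - roughly 17k verified out of 100k submitted - which lands close to the ~15% average ron mentions:)

    # Illustrative figures taken from the thread above, not anyone's live stats.
    submitted = 100_000
    verified = 17_000

    ratio = verified / submitted
    print(f"verified/submitted: {ratio:.1%}")  # -> 17.0%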

     

  • @ron thanks! Gonna cancel my subscription with buyproxies and get 20 private + 30 semi private from some other guys :D
  • ron SERLists.com
    Most here only use semi-private. Private is a waste of money in my opinion.
  • Ozz
    edited April 2013
    @system0102
    >At the moment I have 20 private proxies and I mix them up with public ones in GSA from 3-5 good sources

    Don't do that. Avoid public proxies at all costs. They are good for nothing if you have a couple of private proxies at hand.
  • LeeG Eating your first bourne

    I have been testing more ideas again :D

    Believe it or not, scraped list feeding slows down my submissions

    I pulled a massive list of scraped sites off the net to test the idea - about 4 million URLs after duplicates were removed. I split the list into 10k-URL files, fed it in equal chunks to all the lower tiers, then endured several days of 100 LpM while the projects worked through it.

    100LpM and I was sweating like a sweaty thing that's sweating a lot :O
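
    (For anyone wanting to reproduce the prep step LeeG describes - de-duplicate a big scraped list and split it into 10k-URL files - here is a minimal Python sketch. It is not his actual tooling, and the file names are made up.)

    from itertools import islice

    def dedupe_and_split(src="scraped_urls.txt", chunk_size=10_000):
        # Drop duplicate URLs while keeping first-seen order.
        seen, unique = set(), []
        with open(src, encoding="utf-8", errors="ignore") as fh:
            for line in fh:
                url = line.strip()
                if url and url not in seen:
                    seen.add(url)
                    unique.append(url)

        # Write the unique URLs out as part_0001.txt, part_0002.txt, ...
        it, part = iter(unique), 0
        while True:
            chunk = list(islice(it, chunk_size))
            if not chunk:
                break
            part += 1
            with open(f"part_{part:04d}.txt", "w", encoding="utf-8") as out:
                out.write("\n".join(chunk) + "\n")
        print(f"{len(unique)} unique URLs across {part} files")

    dedupe_and_split()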

     

  • That's interesting LeeG, I am following all your tips (when I get time to play around) and I have gone from 8 to around 25 LpM. I plan to spend the weekend sorting out all my engines and a bunch of other stuff.

    I always thought scraped lists would be much faster but I think it will be worth spending more time on my GSA settings vs the scraping option.
  • LeeG Eating your first bourne

    I have my own theories on why it seems to work that way with me

    Every time I have fed in lists of scraped sites, LpM drops to about half of its normal rate.

  • edited April 2013
    @Velocity Don't be put off site lists.

    It depends on the quality of the list, to be honest. It's logical that if you're feeding the program a list of URLs where it can specifically place its backlinks, this will be far faster than letting the program search for them itself. There's no way searching for targets within the program can be quicker than handing it exactly what it wants.

    If you feed it a good list, you'll get a high LPM count. If it's a bad list, where not many compatible engine URLs are found, then your LPM count will be low. Typically, when using the global site lists, my LPM count is >250 LPM. However, I've seen 600+ LPM when importing URLs directly into the project. If I sorted out my global site list, I could also hit those figures, but I haven't got time for that. It all comes down to the quality of your list.
  • I think Sven should put a banner on the home page with a big image that says....

    ATTENTION! PUBLIC PROXIES SUCK...


  • @ron @Ozz @darman82
    Thanks for the tips. Which would you prefer if you had these options:
    1) 30 private proxies
    2) 20 private + 30 semi-private
    3) 60 semi-private

    Recently I decided to test each engine type in a separate project just to see what is going on. So instead of putting all contextual links in one project, I created four projects, one per engine, with the same settings and data in each. I heard on this forum that Web 2.0 sucks, and yes, I believe there is an issue there, because since the very beginning the Web 2.0-only project has shown grey status most of the time. I don't know why.

  • From what I remember, LeeG said he only uses 30 semi-private proxies, so 30 should be enough; I'm using 30 private proxies right now (I used to use 100 semi-private).

    So far so good... the LpM is still the same and the proxies are fine. So I suggest you go with option one, but use semi-private proxies instead of dedicated ones, since dedicated are more expensive.

  • ron SERLists.com
    30 semi-private is what I have been using all along. 250-300 threads, a 5-second wait between search queries, and a 160-second HTML timeout.
  • @LeeG, can you please explain this? I'm trying to understand it but can't.

    "If you do that, some engines like the blogs are set by default to use blog search engines.

    So add a random selection of blog search engines, or as I have done, edit the engine files so they use any search engine. About an hour's work to do."

  • Hi,

    1) How many private proxies you using?
    100 private proxies, many of them located in France.
    2) How many SE's have you selected?
    38; here are the SEs I use for French:
    google
    google BE
    google BF
    google BI
    google BJ
    google blog
    google CA
    google CD
    google CF
    google CG
    google CH
    google CM
    google DJ
    google FR
    google GA
    google GN
    google GQ
    google GW
    google HT
    google KM
    google LU
    google MC
    google MG
    google ML
    google NC
    google NE
    google PF
    google PG
    google PM
    google RW
    google SC
    google SN
    google TD
    google TG
    google VU
    google WF
    MSN FR
    Yahoo ca
    3) How many threads are you running?
    320
    4) What's your custom HTML timeout?
    120
    5) What's your custom search time between queries?
    5

    I use 1,500 regular French keywords.
    I use CB with 3 retries,
    then DeathByCaptcha if PR is 6 minimum,
    then AskMeBot.

    GSA SEO Indexer runs with my 100 private proxies and 500 threads.

    I use my proxies for everything, including PR checking.
    I had a dedicated server with 4 threads, but the load was too high, so I switched to a dedicated server with 24 threads and a 300 Mbit/sec connection.

    What do you think of the 38 SEs?
    What do you think of my config?


  • "My own recommendations would be four to six random googles, plus four to six random google blog search engines."

    In Edit / Options / Search Engines to use, how do I choose the Google blog search engines?
  • edited May 2013
    1) How many private proxies you using? 250 - semi dedicated 
    2) How many SE's have you selected? 1 - google international only 
    3) How many threads are you running? 800 
    4) What's your custom HTML timeout? 120 
    5) What's your custom search time between queries? 4 sec 

    Currently I'm getting around 15 LPM, but I'm in the phase of importing scraped sites into the projects, so it takes time for SER to identify platforms and submit. I'm happy with this speed.

    Most of the time SER consumes 15% CPU and 600 MB RAM. I'm thinking of increasing threads to 1000 or even higher because I have 250 proxies and plenty of resources left.
  • I think you should put more work into how you set up your projects instead of increasing threads.
    I know this depends on which types of platforms you are submitting to, but 800 threads and 15 LPM seems very low. You can get that LPM with 10 shared proxies and 50 threads.
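
    (For context on the numbers being compared: LpM is just links submitted divided by minutes elapsed. A tiny Python sketch with purely illustrative figures - not anyone's actual logs:)

    # LpM (links per minute) = links submitted / minutes elapsed.
    def lpm(links_submitted, hours_running):
        return links_submitted / (hours_running * 60)

    print(lpm(21_600, 24))   # 15.0  -> in the range of the 800-thread setup above
    print(lpm(360_000, 24))  # 250.0 -> in the range of the global-site-list figures quoted earlier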
  • Hi,

    If my email is blacklisted, then what should I do?

  • Keep it, or change it whenever you feel it's necessary (daily, weekly or monthly - your choice).
  • LeeG Eating your first bourne

    Don't people know that if they use an email that's been caught on a blacklist for one millisecond, their testicles will fester, fall off and be used by dwarf aborigines as targets on the blowpipe target range?

    I've posted proof of this so many times, and of why it's less time-consuming to do nothing and still get results:

    https://forum.gsa-online.de/discussion/comment/20762/#Comment_20762

  • So what you're saying @LeeG is you don't recommend using blacklisted emails? :))

    Glad we cleared that one up. (That is such a graphic warning - love it!)
  • LeeG Eating your first bourne

    It's a question that gets asked time and time again on here.

    And as usual, I can pull a few examples from StopForumSpam to prove the point.

     

    This guy has been busy getting on there

    http://www.stopforumspam.com/ipcheck/142.4.112.122

    But nothing compared to this ip

    http://www.stopforumspam.com/ipcheck/212.59.16.171

     

    Replace your emails and how long until they are listed again?

    1 million profile links or 1 profile link

    You could spend a week swapping them out and bang, first link and you're listed again.
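
    (If you want to script the check LeeG is doing by hand with those links, StopForumSpam also exposes a lookup API. A minimal Python sketch, assuming the api.stopforumspam.org query format with email/ip parameters and a json flag - worth confirming against their current docs before relying on it:)

    import json
    import urllib.parse
    import urllib.request

    def appears_on_stopforumspam(email):
        # Query format is an assumption based on StopForumSpam's public lookup API.
        query = urllib.parse.urlencode({"email": email, "json": ""})
        url = f"https://api.stopforumspam.org/api?{query}"
        with urllib.request.urlopen(url, timeout=10) as resp:
            data = json.loads(resp.read().decode("utf-8"))
        # "appears" is 1 when the address has been reported as a spam source.
        return bool(data.get("email", {}).get("appears", 0))

    print(appears_on_stopforumspam("someone@example.com"))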

     

     

  • edited June 2013
    Ok. I haven't posted much. I have spent most of my time quietly reading this forum, testing, tweaking and testing again. Lots of good advice in these threads, so thank you everyone!

    Following the advice of @LeeG, @ron, @Ozz and others here I'm finally getting some respectable LPM and results. I'm also using LeeG's engine advice and have only selected 7 engines. Here is my LPM and CB screenshot from today:
    [screenshot]

    @LeeG & @ron These are nothing close to the outrageously high numbers you guys are getting, but the results are from various tweaks and settings advice found here, so thanks and cheers guys. :D

    Also, I do want to comment that this is only running ONE project. No tiers. Just one lonely project optimized the best I can, so I'm pretty happy about my wimpy 47 LPM considering ;)

    Anyway, great advice from all the pros here. I will be here regularly; learning, testing and hopefully with some experimenting I can develop some ideas to contribute as well.

    UPDATE: It's been running for a few hours and the LPM is still slowly climbing. I doubt it will hold 71 LPM over a couple of days, but it's been hovering around 68-72 for some time now.
    [screenshot]
  • ron SERLists.com

    That number is sick for one project! Very high. You must have a mixture of comments in that project if I had to guess.

     

  • @ron, since you're using 30 proxies and 5 s between searches with Google, are you having problems with bans? Some users (me included) have been facing this problem since the last update...
  • LeeG Eating your first bourne

    The only time I have suffered the pr-? of doom is when I used buyproxies.org proxies

    Everyone says how good they are, so loads of people use them

    Remember, with shared proxies you never know what the other people sharing them with you are using them for.

    I still rate the quality of Proxy Hub proxies

    And the more the sheep jump ship to BuyProxies, the fewer people use Proxy Hub, and in the long run, the longer those proxies last.

    I get 40 proxies a month and on the odd occasion, I get one or two swapped out

  • Ozz
    edited June 2013
    I wonder if Google has combined proxy bans for PR checking, searching and reCAPTCHA, for instance? In the past those were handled separately, but things might have changed.

    So if you use proxies for all the things mentioned above, you may get banned earlier. This is just a theory though, and I haven't been hit by proxy bans so far, but I don't use any PR filter anymore and I skip hard-to-solve captchas as well.