Proxies, HTML Timeout, Threads - Max Efficiency


Comments

  • LeeG Eating your first bourne

    nicerice, the lines you're looking for are:

    ;0=no, 1=yes, 2=only blog search
    use blog search=1

    I just delete them myself

  • I just figured it out the exact moment you posted this. I will do a quick mass edit using Notepad++ (backing up the whole universe BEFORE I do so, just in case). Thanks again, LeeG!
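
    A minimal sketch of that kind of mass edit (my own illustration, assuming the engine definitions are plain-text .ini files under SER's Engines folder; the path below is a placeholder, and the script backs the folder up before touching anything):

    import pathlib, shutil

    # Hypothetical sketch: back up SER's Engines folder, then flip
    # "use blog search=1" to 0 in every engine .ini file.
    # The path is a placeholder - point it at your own installation.
    engines = pathlib.Path(r"C:\GSA Search Engine Ranker\Engines")
    shutil.copytree(engines, engines.with_name("Engines_backup"), dirs_exist_ok=True)

    for ini in engines.glob("*.ini"):
        text = ini.read_text(encoding="utf-8", errors="ignore")
        if "use blog search=1" in text:
            ini.write_text(text.replace("use blog search=1", "use blog search=0"),
                           encoding="utf-8")
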
  • LeeG Eating your first bourne

    I know I sound extra cautious at times.

    But some people are beyond help at times, and giving them the information is like giving them a lighter in a fireworks factory.

    You can also add extra footprints to boost searches and results.

    I was playing for a day to get things to a stage I was happy with.

    Make an edit, run for a good few hours, etc.

    Cut out using Scrapebox or the built-in scraper and just add the footprints straight to those files.

  • GlobalGoogler: @LeeG @Ozz: Hey guys, someone mentioned they selected more than the default SEs. My question is, which particular SEs should I select? I'm confused about what to select :\
  • I will dig up extra footprints now ... the first change has already increased speed noticeably.
  • LeeG Eating your first bourne

    There are a lot of different opinions on what to use, what works best.

    I prefer the rule of less is more.

    Google, no matter what country you use, can produce up to 100 results per page.

    And all countries return the same results, give or take 6 to 8 positions.

    Some engines, by default, use the blog search engines. That's something I have personally changed in the files so they are not used.

    A lot of engines also draw their search results from the main players; there are very few independent engines out there. The independents are Yandex in Russia and Baidu in China, plus a few others.

    Yahoo is Bing. Plus there are a load of others that feed off Google and Bing.

    My own recommendations would be four to six random Googles, plus four to six random Google blog search engines.

    But that's just me going against what everyone else recommends.

    The more obscure the better, so you have less chance of the IPs you use being banned on them for too many search queries.

  • LeeG: I am running 100+ AD projects now on SER. Any setting recommendations on:
    1) How many private proxies are you using?
    2) How many SEs have you selected?
    3) How many threads are you running?
    4) What's your custom HTML timeout?
    5) What's your custom search time between queries?

    ...so I can get more results? Because I'm just getting 100+ results a week :/
  • GiorgosK Greece
    edited January 2013
    @lrichard2112:
    - 10 private proxies with 90 threads has tripled verified for me
    - use less restrictive options for finding TARGET URLS
    - start low on the threads (20-40) and go up a few threads every few hours, until just before the point where the computer freezes (CPU always at 99% is not good)
    - use more keywords so you can get more targets
    - experiment with more search engines
    - HTML timeout should be a little above the thread count, I believe (100 threads → 105 HTML timeout; see the sketch after this list)
    - I usually leave the search time between queries alone
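
    A quick sketch of that timeout rule of thumb (my own reading of it, not an official formula from SER):

    # Hypothetical helper for the rule of thumb above: HTML timeout a little
    # above the thread count (100 threads -> ~105 s timeout).
    def html_timeout(threads: int, margin: int = 5) -> int:
        return threads + margin

    for t in (20, 40, 100):
        print(t, "threads ->", html_timeout(t), "s HTML timeout")
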
  • AlexR Cape Town
    @LeeG - photoshop was definitely a joke. Even the Germans caught it as such. ;-)

    I always read your posts..very informative!

    What an epic discussion. 

    I'm running some SE tests to see how to improve this area. 

    I'm looking for a system to measure keyword overlap, e.g. "blue widget" vs "lovely blue widget" have a big overlap. Basically, I'm trying to find a way to measure this overlap and only select keywords with a set uniqueness %. Does anybody have any ideas here?
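
    One way to approximate that (a sketch, not a tested recipe): treat each keyword as a set of words and compute Jaccard overlap, keeping a keyword only if it stays under a chosen overlap threshold against everything already kept:

    # Sketch: keep a keyword only if its word-set overlap (Jaccard) with every
    # already-kept keyword stays below a threshold, i.e. it is "unique enough".
    def jaccard(a: str, b: str) -> float:
        wa, wb = set(a.lower().split()), set(b.lower().split())
        return len(wa & wb) / len(wa | wb)

    def filter_unique(keywords, max_overlap=0.5):
        kept = []
        for kw in keywords:
            if all(jaccard(kw, k) <= max_overlap for k in kept):
                kept.append(kw)
        return kept

    print(filter_unique(["blue widget", "lovely blue widget", "red gadget"]))
    # "lovely blue widget" overlaps "blue widget" at 2/3, so it is dropped
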
  • LeeG Eating your first bourne

    If you use scrapebox to scrape keywords, chances are you will end up with a lot of keywords along those lines

    Adding different Google engines to each project is a great way to learn how many different countries/regions there are, and how many countries are censored with no access to Google.

  • 1 - 100 shared proxies. I also use them for harvesting within Scrapebox
    2 - N/A. I only use the global site list
    3 - 300 threads, sometimes 500
    4 - 90 seconds, or 120 for 500 threads
    5 - N/A, as I don't allow GSA SER to search for me
  • AlexR Cape Town
    @LeeG "I'm looking for a system to measure keyword overlap. I.e. "blue widget" vs "lovely blue widget", has a big overlap. Basically, trying to find a way to measure this overlap and only select keyword with a set uniqueness %. " Does you have an idea on how to check/resolve this?
  • 1) How many private proxies are you using? - 30
    2) How many SEs have you selected? - 156 English
    3) How many threads are you running? - 50-150
    4) What's your custom HTML timeout? - 120
    5) What's your custom search time between queries? - default

    In less than 24 hours I got around 10,000 verified links with 6 projects + 1 tier each, a total of 12. Btw, before starting any project I recommend first scraping for website lists and importing them. This is how I did it, and believe it or not, I get tons of verified links daily.
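
    A small sketch of the kind of cleanup that helps before importing a scraped list (my own illustration, not a SER feature): dedupe the scraped URLs by domain so the same site isn't imported repeatedly. The filenames are placeholders:

    from urllib.parse import urlparse

    # Hypothetical cleanup: keep only the first URL seen per domain.
    def dedupe_by_domain(urls):
        seen, kept = set(), []
        for u in urls:
            if "//" not in u:              # schemeless lines from scrapers
                u = "http://" + u
            domain = urlparse(u).netloc.lower()
            if domain and domain not in seen:
                seen.add(domain)
                kept.append(u)
        return kept

    with open("scraped_urls.txt") as f:    # placeholder filename
        unique = dedupe_by_domain(line.strip() for line in f)
    with open("import_ready.txt", "w") as f:
        f.write("\n".join(unique))
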
  • Btw ... how do you set SER to only search and not submit?

    The opposite, only submit and not search, you can achieve by disabling all SEs and unchecking the 2 checkboxes (always use keywords to find target sites + also analyse competitors' posts).

  • ron SERLists.com

    The most valuable tip I could give somebody that wants to put their submission count into hyperdrive is:

    Set the custom verification on "no limit" projects to "Custom - 7200 minutes".

    When you build links on lower tiers that have no limits, you will end up with projects that have tens or hundreds of thousands of submitted links to verify. If you just let GSA verify when it wants to, it will spend considerable time going through this process each day.

    If you pace out when GSA does this, your submissions will go through the roof. I'm at 2,000 per hour on just 125 threads.

  • Thanks Ron,  I'm trying that out now. :)
  • @ron killer tip. Thank you bro
  • GiorgosK, thanks for these amazing tips :)

    @ron: hello ron, I didn't get what you explained :( Where can I set the custom verification?

  • edited January 2013
    The best tip: get fast private proxies (PP) with response < 1 sec. I had 40 PP with response > 2 seconds + 360 threads, settings more or less tuned ;-) and got 20k submits + 2k verified each 24 hours. With the same number of PP but response < 1 sec ==> 130k submits + 15-30k verified each day.

    360 threads on a dedi is ~50% CPU usage, running other tools in parallel, so you can hit much higher numbers if you want to get crazy here.
  • ron SERLists.com
    edited January 2013

    @lrichard2112 - Pick any project and open it...It's under the Options tab toward the top of the page under "How To Submit / Verify".

    @TOPtActics - Fast private proxies and threads do make a big difference. Delay the verify function and you will have a lot more.

  • @TOPtActics how do you test your proxies? Scrapebox? I can't see the response time in GSA
  • @ron: see my SS, is this right?
  • @hyde, mind if I answer that? I usually test proxies with my own tools, e.g. UD.
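
    For anyone without such tools, a rough sketch of timing proxies yourself (assumes Python's requests library; the proxy list and test URL are placeholders):

    import time
    import requests  # third-party: pip install requests

    # Rough sketch: time each proxy on a simple GET; anything much over 1 s is
    # a candidate for replacement. The proxies below are placeholders.
    proxies = ["1.2.3.4:8080", "5.6.7.8:3128"]
    for p in proxies:
        try:
            start = time.time()
            requests.get("https://www.google.com",
                         proxies={"http": f"http://{p}", "https": f"http://{p}"},
                         timeout=5)
            print(p, f"{time.time() - start:.2f}s")
        except requests.RequestException:
            print(p, "failed")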

  • ron SERLists.com

    @lrichard2112 - You got it right!

    @hyde - The response times are all listed in GSA. Options=>Configure. 2nd column from the end.

  • @ron: super, thanks ron :)
  • @TOPtActics: I put my verify time down from 5 sec to 1 sec as you suggested and am getting much better submit/verify rates now ... what a "few" seconds can do ;) thx
  • AlexR Cape Town
    Hi Guys


    I'm just wondering how many are affected by it. ;-)
  • edited January 2013
    LeeG,

    Thanks for that tip. I had it checked the whole time, thinking it was "safer," but never stopped to realize there isn't much downside to leaving it unchecked because most sites will prevent duplicate registration anyway.

    Do you do this with upper tiers as well, or only lower tiers for some reason?

    LeeG said:

    If you want a quick blast of easy links: on the lower tiers, make sure you don't have a tick in "don't post to the same domain", or however it's worded.

    Most websites won't let you register a second time. If it's a forum, you're already registered, etc.

    Social bookmarks: SER will log into the accounts you already have and add more links to those, making the accounts look more natural, rather than one bookmark per account.

    It's an easy way to add extra links. And if you're using global site lists, from time to time links will be added to those accounts you already have set up.

  • edited January 2013
    @ron "Set the custom verification on "no limit" projects to "Custom - 7200 minutes".

    thats 5 days..I read in another thread  that SER will delete your submitted links after 5 days if not verified [unless you tick 'don't remove urls'..I am not sure how it works..but if you set verify after 5 days chances are SER will delete your submitted links as not verified and when actually it start verifying links, there are chances some of your links already been deleted from submitted list.

    For safer side I would set it at 3000
  • AlexR Cape Town
    @LeeG - "Social bookmarks, ser will log into the accounts you already have and add more links to those, making the accounts look more natural. Rather than one bookmark one account"

    1) To do this, don't you have to clear your target URL history/cache?
    2) Does it do it automatically?
    3) Or do you have a set of master accounts that you keep using?
