AlexR Cape Town
edited August 2012 in Need Help
Hi

I'm running a VPS with 2 GB of RAM and a 6.80 GHz processor.

I run SB, TweetAdder, GSA & CS.

I have set GSA to run at 550 threads, a 150 s HTML timeout, a 3 s custom wait time, and 6 proxies. The VPS runs at about 1.1 GB of RAM usage on average. I have 150 active projects in GSA.

My main goal is to get high-quality links, so I have it set to PR2+ with a max of 100 OBL.

My question is:
1) Can I increase GSA above 550 threads? (What are others running at?) Can I push it to 800 or 950?
2) Can I reduce the custom delay to 1 s? (This just seems far too low!) But given the number of projects, and that I have all English SEs enabled (around 120), maybe it's fine? How would I test whether 1 s is fine?
3) With this many threads, should I reduce the HTML timeout or set it to the max of 180?

(It just seems from the forums that everyone is using far more conservative settings here, and I'm wondering if I am missing something!)

Any help is appreciated...

Comments

  • Sven www.GSA-Online.de

    550 threads is far beyond the recommended number. I would never use more than 100, but it's up to you.

    Setting the maximum timeout is recommended. Also, using a very low wait time between SE queries is not recommended unless you have a lot of private proxies.


  • AlexR Cape Town
    1) How can I see if I have set the max timeout too low?
    2) How can I see if I have too many threads? How do I test this? Surely the more threads, the quicker you can get through data to find the links you need? (Doesn't the number of threads depend on the network connection and the VPS's capability?)

    Thanks



  • Ozz
    edited August 2012
    You can also lower your maximum website size filter to <10 MB if you want faster submissions.

    I don't know if this is useful for you, but I've set all my Tier 2+3 projects to use no SEs and feed them with "use global site list" only. It's working for me, as the Tier 1 project does all the searching and fills up the global site list for the other tiers.

    A "1 s" search query time is a bit too risky with only 6 proxies, IMO. 35 of the "all English" SEs are Google variants, and they will get used a lot, I think. Furthermore, I would advise you to test the SEs separately from each other and uncheck the engines that are not giving you good results.

    Some SEs give only page 1 results, as they don't have "Page 1", "Page 2", "Page 3" links to add more results. Instead they have a "show more" button, like yandex.com or DuckDuckGo. GSA can't handle this, AFAIK.

    Other SEs have slow servers and will slow the search time down.

    I would advise you to test them with "search online for urls" and an easy keyword like "powered by vbulletin". Write down the results and draw your conclusions.

    Fewer SEs with good results + a higher search query time > more SEs with probably worse results and a lower query time, IMO.
  • I did an SE test myself a couple of days ago, and these were my conclusions. As Sven has improved some engines ("blekko", for instance), you should review the results yourself again.

    I did some research on the international SEs with the "search online for urls" tool and the keywords "powered by vbulletin" and "leave a comment" website.

    Here are my results without dupes (KW1|KW2):
    -google (453|394) -> very, very fast
    -msn/bing (145|17) -> wtf?? good speed though
    -yahoo (855|825) -> very fast
    -hotbot (850|807) -> moderate speed
    -ecosia (849|792) -> very fast
    -info (677|607) -> moderate speed
    -lexxe (573|454) -> moderate speed
    -metacrawler (416|416) -> moderate speed
    -sky (406|305) -> moderate speed
    -euroseek (430|287*) -> slow. *idle mode after some time (aborted)
    -exalead (94*|599) -> moderate speed. *idle mode after some time (aborted)
    -mamma (241|209) -> moderate speed
    -ask (192*|129*) -> *idle mode (aborted)

    crappy results: search, searchhippo, jayde
    only page 1 results: blekko, dogpile, excite, gigablast, ixquick, scirus, yandex
    not working: aol, charter, clusty, lycos, scrubtheweb, teoma, thunderstone, verizon, volunia
  • AlexR Cape Town
    I have also set the maximum website size filter to <3 MB. I only want clean sites without thousands of comments. (I have now set it to 5 MB to see how it handles that.)

    What search query time would you advise? 

    I like your idea: "I've set all my Tier2+3 projects to use no SE and feed them with "use global site list" only. It's working for me as the Tier1 project does all the search and fill up the global site list for the other Tiers."

    Almost all my projects are Tier 1, so I can't really use that. I will use it later when I do some Tier 2 promotion, if needed. Great idea, BTW. :-)
  • AlexR Cape Town
    @Ozz - I take it you do not use the "use all English SEs" option?
  • No. Right now I'm using:
    INT: blekko, ecosia, euroseek, exalead, google, google blog, hotbot, info, lexxe, mamma, metacrawler, msn, sky, teoma, yahoo
    UK/US/...: all the local ones of the above SE + altavista (not tested), izito, tiscali (not tested) and zapmeta
  • AlexR Cape Town
    Thanks!

    What search query time would you advise? 

    Also, can I set the maximum website size filter to 2 MB? What are your thoughts on this?

    Also - what are your thoughts on threads? Surely having more, spread out over 100 projects, will be fine?
  • Ozz
    edited August 2012

    What search query time would you advise?
    My theory is:

    Search Query Time = 60 / #proxies / #SEs + SafetyTime

    - 60 seconds is a safe interval for a Google search query
    - divided by the number of proxies you use
    - divided by the number of different SEs you have checked. Remember that Google INT, Google UK, and Google US should be counted as only ONE SE.
    - SafetyTime covers the case that some of your proxies stop working, or that GSA doesn't handle the SE threads in a fixed order (search query 1 = Google INT, SQ2 = hotbot, SQ3 = Google UK, SQ4 = Google US => 3 SQs hitting Google in a very short time). This is just a buffer, which could be anywhere between 0 s (high risk) and 10 s (low risk).
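    As a rough sketch, Ozz's rule of thumb can be written as a small Python function (the function name and default buffer are mine, not anything from GSA SER):

```python
# Sketch of Ozz's rule of thumb for a safe wait between SE queries.
# Names and the default safety buffer are illustrative, not from GSA SER.
def safe_query_delay(num_proxies: int, num_engines: int, safety_time: float = 5.0) -> float:
    """Estimate a safe delay (seconds) between search engine queries.

    60 s is assumed to be a safe interval for one Google query from one IP;
    more proxies and more distinct engine families spread the load, so the
    per-query delay can shrink. safety_time (0-10 s) buffers against dead
    proxies or several queries hitting the same engine back to back.
    Google INT/UK/US count as ONE engine here.
    """
    return 60.0 / num_proxies / num_engines + safety_time

# The thread's scenario: 6 proxies and roughly 15 distinct engine families
print(round(safe_query_delay(6, 15), 2))  # 60/6/15 + 5 = 5.67 s
```

    By this estimate, a 1 s delay would only be safe with many more proxies or engines than the 6 proxies discussed above.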




  • AlexR Cape Town
    Thanks Ozz - really appreciate the input here... :-)

    1) What are your thoughts on threads? Surely having more, spread out over 100 projects, will be fine?
    2) Maximum website size filter at 2 MB? What are your thoughts on this?

    Once I have run these for a while, I'll post a thread and see the kind of results others are getting with different settings. :-) No point leaving VPS resources unused or GSA SER not fully maxed out... ;-)
  • Ozz
    edited August 2012
    I really can't tell you much about how many threads you can use. Maybe some other "high volume ranker" can give you better advice.

    About the filter: I've set mine to 7 MB, but I don't know what is best, because it is hard to compare those settings against each other. If your results with 2 MB are as good or better, let us know though :)
  • AlexR Cape Town
    Will run some tests and let everyone know. 

    The only thing I don't understand is why Sven said "I would never use more than 100 but it's up to you."

    Surely this depends on the CPU, RAM, and load on the VPS? Bigger VPS = more threads... why the limit of 100?
  • About threads.
    When running from a pre-scraped list, for posting or for checking platforms and adding to the master list, on a VPS similar to yours, I find 100 to be the max it can do without running at max CPU most of the time.

    However, on the odd occasion I use it in scrape-and-post mode, 100 threads hardly register on the CPU.
    I presume this is because the scraping part is 'blocking' threads while they obey the wait-between-searches delay, so fewer threads are doing CPU-intensive calculations at any one time.

    This is just a guess, as I have no idea how GSA SER allocates threads, what runs asynchronously, etc.
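    For what it's worth, the general point behind that guess can be shown with a toy Python script (this says nothing about GSA SER's actual internals): threads blocked in a wait consume wall-clock time but almost no CPU time, so many "sleeping" scraper threads cost very little.

```python
# Toy illustration: 100 threads that spend their lives blocked in a wait
# (like a wait-between-searches delay) use almost no CPU time.
import threading
import time

def waiting_worker(delay: float) -> None:
    # time.sleep blocks the thread and releases the CPU entirely.
    time.sleep(delay)

start_cpu = time.process_time()  # CPU seconds used by this process so far
threads = [threading.Thread(target=waiting_worker, args=(1.0,)) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
cpu_used = time.process_time() - start_cpu

# 100 threads each "ran" for 1 s of wall time, yet CPU time stays tiny.
print(f"CPU seconds consumed by 100 sleeping threads: {cpu_used:.3f}")
```

    A thread doing hashing, parsing, or posting work would show up on the CPU instead, which would fit the observation that posting-only runs max out the CPU at 100 threads while scrape-heavy runs barely register.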

    It would be nice to have a thread count in the GUI, maybe in the bottom bar, SB-style, and even nicer to separate it into active and sleeping threads.
    However, I realize this is likely to cause 5 users to say 'oh, nice, thanks' and all the rest to email Sven asking why half the program is asleep!
  • AlexR Cape Town
    Thanks for the reply. 

    I haven't tried just posting to a pre-scraped list.

    I just set it to 600 connections, a 2 s interval, and a 150 s timeout.

    Runs like a dream with CS and TweetAdder running as well. (Sometimes I drop the above to 550 threads.)

    :-)

    Will let you know after another week how the VPS is holding up and if it still runs perfectly.
    However, I realize this is likely to cause 5 users to say 'oh, nice, thanks' and all the rest to email Sven asking why half the program is asleep!

    I would say thank you AND ask why half of the threads are asleep. It would be nice to really USE the resources that are available.