
[Feature request] GSA CG threads for checking

Hi,
I would like to request that the "set amount of threads" option be split into two separate settings: threads to use for search engines and threads to use for checking URLs/other sites.

GSA CG could be a lot faster at checking things like whether URLs support HTTPS if we could set a higher number of threads for checking, independently of the lower count we generally use for searching when using private proxies and trying not to get them banned. Checking URLs on sites other than search engines doesn't necessarily require proxies, and you normally won't get banned, so more threads could be used there.
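To illustrate the idea (just a sketch, not how GSA CG is implemented internally), two independent worker pools could be sized separately: a small, throttled pool for proxy-based search-engine queries and a much larger one for plain URL checks. The thread counts, function names, and URLs below are made up for the example.

```python
# Illustrative sketch only -- not GSA CG's actual implementation.
# The point: URL checking can use many more threads than proxy-based
# search-engine scraping, because the two limits are independent.
from concurrent.futures import ThreadPoolExecutor
import urllib.request

SEARCH_THREADS = 3   # kept low to protect private proxies from bans
CHECK_THREADS = 50   # checking ordinary sites needs no proxies

search_pool = ThreadPoolExecutor(max_workers=SEARCH_THREADS)
check_pool = ThreadPoolExecutor(max_workers=CHECK_THREADS)

def search_engine_query(query: str) -> str:
    # placeholder for a proxied search-engine request
    return f"results for {query}"

def supports_https(url: str) -> bool:
    # simple check: does the site answer on https?
    try:
        with urllib.request.urlopen("https://" + url, timeout=10):
            return True
    except Exception:
        return False

search_jobs = [search_pool.submit(search_engine_query, q) for q in ["keyword one", "keyword two"]]
check_jobs = [check_pool.submit(supports_https, u) for u in ["example.com", "example.org"]]
print([j.result() for j in check_jobs])
```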

Thanks


Comments

  • Sven www.GSA-Online.de
    added for next update
    Thanked by: the_other_dude
  • edited June 2022
    Thank you!! Now my proxies won't get banned so much when I'm searching Google and have forgotten to decrease the threads after checking.
  • googlealchemist Anywhere I want
    Hey, I'm curious about your settings. Why multiple threads but a 20-second delay?
  • 1 thread per 10 proxies is what I use when scraping Google with dedicated datacenter proxies. In this case I had 30 proxies.
  • googlealchemist Anywhere I want
    Thanks for the feedback, that's really interesting... and confusing to me. I have 150 dedicated proxies and through trial and error had to set a 4-second delay at one thread to avoid being banned.
  • edited July 2022
    That's just how I have been scraping with ScrapeBox for about a decade to avoid bans, since the time they got tough on scraping. So I used the same settings with CG. My proxies do not get banned that way, or at least not very often. I may not be using the tool the same way you are, either. I don't use the article scraper much currently.
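As a rough back-of-the-envelope comparison of the two setups discussed above (assuming the delay is the pause each thread waits between queries and that queries rotate evenly across all proxies; both are assumptions, not documented GSA CG behaviour):

```python
# How often each individual proxy hits Google under the two setups above.
def seconds_per_query_per_proxy(threads: int, delay_s: float, proxies: int) -> float:
    """Average seconds between queries sent through any single proxy."""
    total_queries_per_second = threads / delay_s
    return proxies / total_queries_per_second

# "1 thread per 10 proxies" with 30 proxies and a 20 s delay
setup_a = seconds_per_query_per_proxy(threads=3, delay_s=20, proxies=30)
# 150 dedicated proxies, 1 thread, 4 s delay
setup_b = seconds_per_query_per_proxy(threads=1, delay_s=4, proxies=150)

print(f"30 proxies, 3 threads, 20 s delay: one query per proxy every {setup_a:.0f} s")
print(f"150 proxies, 1 thread, 4 s delay: one query per proxy every {setup_b:.0f} s")
# Roughly 200 s vs 600 s between hits on any one proxy, so both setups
# stay well on the conservative side.
```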