
scraping limit?

Is there a limit on scraping per project? I set the scraping threads to 1000, but it won't go over 500-600.

Can't seem to get it any higher.


Comments

  • Sven (www.GSA-Online.de)
    With 10 proxies it will open just 10 connections per search engine.
  • @Sven How can I change this? I can't add duplicate proxies; I have rotating proxies, so they all share the same address, and when I try to add duplicates it says no new proxies were added.
  • I also have 1000+ new proxies, but it still only scrapes with 500+ of them.
  • Sven (www.GSA-Online.de)
    I will add a better setting in the next update so that you can define the exact timing for search queries.
  • Sven said:
    I will add a better setting in the next update so that you can define the exact timing for search queries.

    Do you know when this update will come?
  • Sven (www.GSA-Online.de)
    it's out already
  • Sven said:
    it's out already
    @Sven What is the automatic delay set to, so I can test different delays?


  • Sven (www.GSA-Online.de)
    That delay depends on the search engine. Some have a 60 second delay, some 1 second.
  • So that means there's no need to test, since the delay is set automatically for each search engine?
  • Sven (www.GSA-Online.de)
    Yes. If you think the delays are too low (e.g. when using shared proxies), try using the option to add some extra seconds.
  • Sven said:
    Yes. If you think the delays are too low (e.g. when using shared proxies), try using the option to add some extra seconds.
    @Sven Is there any way to get the limit higher than 500-600 per project?
  • Sven (www.GSA-Online.de)
    Sorry, but which limit exactly?
  • Sven said:
    Sorry, but which limit exactly?
    @Sven
    Per project, I can only get 500-600 threads running, which limits the completion time. Is there anything I can change so the project I'm running gets full resources, or is there a thread limit per project?
  • Sven (www.GSA-Online.de)
    The only limits are memory usage and the proxy amount.
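The ceiling discussed in this thread (connections limited per proxy per search engine) can be sketched as a simple back-of-the-envelope model. The function and the proxy/engine counts below are illustrative assumptions, not GSA's actual implementation:

```python
# Hypothetical model of the thread ceiling: if the scraper opens at most
# one connection per (proxy, search engine) pair, the effective thread
# count is capped at proxies * engines, no matter how high the thread
# setting is configured.

def effective_threads(configured_threads: int, num_proxies: int,
                      num_engines: int) -> int:
    """Upper bound on concurrently running scraper threads."""
    return min(configured_threads, num_proxies * num_engines)

# e.g. 1000 configured threads, 10 proxies, ~55 enabled engines:
print(effective_threads(1000, 10, 55))  # -> 550, within the observed 500-600
```

Under this model, raising the thread setting past the proxy-times-engine product has no effect; only more proxies (or more enabled search engines) would lift the ceiling, which matches Sven's answer above.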