
Crawl for URLs should have an option to use proxies

Hello, crawling URLs/anchors should have the ability to use proxies, because otherwise chances are the server will see the activity as bot activity and block GSA SER from fetching the URLs and anchor text.

:) Hope you know what I mean. A lot of servers now have protection, and I often get my own IP blocked from my own site when fetching a lot of URLs and anchors using GSA SER.
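To make the idea above concrete, here is a minimal sketch of what "fetching URLs and anchor text through a proxy" looks like in plain Python. The proxy address `proxy.example:8080` is a hypothetical placeholder, and this is not how GSA SER does it internally; it just illustrates routing the fetch through a proxy so the target server sees the proxy's IP, then pulling `(href, anchor text)` pairs out of the HTML.

```python
import urllib.request
from html.parser import HTMLParser

class AnchorExtractor(HTMLParser):
    """Collects (href, anchor_text) pairs from an HTML page."""

    def __init__(self):
        super().__init__()
        self.anchors = []   # finished (href, text) pairs
        self._href = None   # href of the <a> currently open, if any
        self._text = []     # text chunks seen inside that <a>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.anchors.append((self._href, "".join(self._text).strip()))
            self._href = None

def extract_anchors(html):
    """Return all (href, anchor text) pairs found in an HTML string."""
    parser = AnchorExtractor()
    parser.feed(html)
    return parser.anchors

# Hypothetical proxy address: route the request through it so the
# target server sees the proxy's IP address instead of yours.
opener = urllib.request.build_opener(
    urllib.request.ProxyHandler({"http": "http://proxy.example:8080"}))
# html = opener.open("http://example.com/").read().decode("utf-8")
# print(extract_anchors(html))
```

The actual fetch is left commented out since the proxy host is made up; swapping in a real proxy (or removing the `ProxyHandler` entirely) is all that changes.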



  • SvenSven
Well, usually the server where you crawl URLs is your own, no?
  • edited December 2018
@sven Indeed, but often the server will "detect unusual activity" and block the IP address automatically, and if you have a dynamic IP like mine this is a pain each time. It means I have to either go on the site and fill out a reCAPTCHA, or go into cPanel and unblock myself, which can be a bit of a pain. It might also be a problem if I choose to crawl my blogs hosted on platforms such as WordPress or Tumblr: if you have an auto blog, the crawling might cause something to ping and get it detected as an auto blog by the system, for instance.
  • SvenSven
The next update will allow you to use proxies.
  • edited December 2018
Thank you! Having a little trouble though: the "use proxies" option is grayed out and unclickable for me?
  • SvenSven
Oops, sorry, need to change that (silent update).
  • @Sven
Is there any way to bulk crawl (other websites) and store the links crawled? I mean, extract all the links from many websites.
  • SvenSven
You have that in options->advanced->search->Import URLs (holding site lists).
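Outside of GSA SER, the bulk-crawl-and-store idea asked about above can also be sketched in a few lines of Python. This is only an illustration under assumptions, not the tool's own method: `fetch` is any callable that takes a URL and returns the page HTML (so a proxy-aware fetcher, or a stub, can be plugged in), and the output file is one URL per line, the plain format most import dialogs accept.

```python
import re
from urllib.parse import urljoin, urlparse

# Crude href matcher; good enough for a sketch, not a full HTML parser.
LINK_RE = re.compile(r'href=["\'](.*?)["\']', re.IGNORECASE)

def collect_links(seed_urls, fetch):
    """Fetch each seed page once and return the absolute http(s) URLs it links to.

    `fetch` is a callable url -> html string, so the caller decides how
    pages are retrieved (direct, via proxy, or a stub in tests).
    """
    found = set()
    for seed in seed_urls:
        try:
            html = fetch(seed)
        except OSError:
            continue  # unreachable site: skip it and keep going
        for href in LINK_RE.findall(html):
            absolute = urljoin(seed, href)  # resolve relative links
            if urlparse(absolute).scheme in ("http", "https"):
                found.add(absolute)
    return sorted(found)

def save_site_list(urls, path):
    """Write one URL per line, ready to be imported elsewhere."""
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(urls) + "\n")
```

Keeping the fetcher pluggable is the main design choice here: the same `collect_links` works with a direct `urllib` opener, a proxy-backed one, or a canned-HTML stub when testing.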