Crawling for URLs should have an option to use proxies
Hello, crawling URLs/anchors should have the ability to use proxies, because otherwise the target server is likely to see the activity as bot activity and block GSA SER from fetching the URLs and anchor text.
Hope you know what I mean. A lot of servers now have protection, and I often get my own IP blocked from my own site when fetching a lot of URLs and anchors with GSA SER.
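To illustrate the idea: rotating requests across a pool of proxies keeps any single IP from hammering the server. A minimal round-robin sketch in Python (the proxy addresses are made-up placeholders, and `next_proxy()` is just an illustrative helper, not anything in SER):

```python
# Minimal sketch of round-robin proxy rotation for URL fetching.
# The proxy addresses below are placeholders, not real servers.
from itertools import cycle

PROXIES = [
    "http://proxy1.example.com:8080",
    "http://proxy2.example.com:8080",
    "http://proxy3.example.com:8080",
]

_pool = cycle(PROXIES)

def next_proxy():
    """Return the next proxy as a dict in the shape requests expects
    for its proxies= argument."""
    addr = next(_pool)
    return {"http": addr, "https": addr}
```

Each fetch would then pass `next_proxy()` to the HTTP client (e.g. `requests.get(url, proxies=next_proxy())`), so consecutive requests come from different IPs and are far less likely to trip the server's bot protection.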
Thanks.
Comments
Is there any way to bulk crawl (other websites) and store the crawled links? I mean, extract all the links from many websites.
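Not a built-in SER feature as far as I know, but as a standalone sketch: once you have a page's HTML, extracting every link with its anchor text needs only Python's standard library (the URLs below are made-up examples; fetching the HTML itself is not shown):

```python
# Sketch: extract (href, anchor_text) pairs from HTML using only the stdlib.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []     # collected (href, anchor_text) pairs
        self._href = None   # href of the <a> tag currently open, if any
        self._text = []     # text fragments seen inside that <a> tag

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

html = '<p><a href="https://example.com/a">First link</a> and <a href="/b">second</a></p>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)
# → [('https://example.com/a', 'First link'), ('/b', 'second')]
```

Run that over each fetched page of each site, write the pairs to a file or database, and you have a bulk link store; combining it with the proxy rotation discussed above would keep the crawling from getting blocked.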