Alright, my first questions are about how GSA SER behaves in the following scenarios:
1: I disable search engine usage and import the root URLs of 1M websites. Does GSA SER check any URLs other than those 1M imported URLs? For example, if some of them are blogs where comments can be made, do I need to provide the individual blog post URLs for GSA SER to be able to post comments?
2: How big a domain list text file can I import into GSA SER? I currently have about 130M domains, and my own software is checking which of them are alive; my rough estimate so far is that about 40M are. I plan to import them all and let GSA SER process them. Do I need to split the file and import each part separately? How many websites can it handle in its repository, and what limitations should I expect?
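In case chunked imports work better than one huge file, here is a rough sketch of how the list could be split beforehand. This is my own helper, not part of GSA SER; the chunk size and output naming are arbitrary assumptions.

```python
# Hypothetical helper: split a large domain list into smaller files.
# Chunk size and ".partNNN.txt" naming are assumptions, not GSA SER requirements.

def _write_chunk(path, index, chunk):
    out_path = f"{path}.part{index:03d}.txt"
    with open(out_path, "w", encoding="utf-8") as dst:
        dst.write("\n".join(chunk) + "\n")
    return out_path

def split_domain_list(path, chunk_size=1_000_000):
    """Stream the source file and write one-domain-per-line chunk files."""
    out_paths = []
    chunk, index = [], 0
    with open(path, "r", encoding="utf-8", errors="ignore") as src:
        for line in src:
            domain = line.strip()
            if not domain:  # skip blank lines
                continue
            chunk.append(domain)
            if len(chunk) >= chunk_size:
                out_paths.append(_write_chunk(path, index, chunk))
                chunk, index = [], index + 1
    if chunk:  # flush the final partial chunk
        out_paths.append(_write_chunk(path, index, chunk))
    return out_paths
```

Streaming line by line keeps memory flat even with a 130M-line file, since only one chunk is held at a time.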
3: Is there an option to make GSA SER crawl an imported URL list to a certain depth? If so, I would like it to crawl all URLs found on each given page, i.e. depth 1.
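If GSA SER cannot do this, a depth-1 crawl could be done externally before importing: fetch each root page and collect every link on it. A minimal sketch of the link-extraction step using only Python's standard library (the class and function names are my own):

```python
# Hypothetical depth-1 helper: extract all <a href> targets from a page's HTML,
# resolved to absolute URLs. Fetching the HTML (e.g. via urllib) is left out.
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # urljoin turns relative paths into absolute URLs
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links
```

Running this once over each imported root page and writing the results back to a text file would give the depth-1 URL list to feed into GSA SER.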
Thank you very much.