Engine recognition
How is GSA SER able to identify website engines? Based on what? Footprints?
I'm asking because I cleaned my list of wikis with the clean-up option under Tools. After that I used those websites (100% of them were wikis, fully working) and I was still getting a lot of "no engine matches"... How is that possible?
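For context, engine recognition generally comes down to downloading a page and matching its source against per-engine footprint strings. Below is a minimal sketch of that idea; the engine names and footprint markers are assumptions for illustration only, not GSA SER's actual engine definitions.

```python
from typing import Optional

# Hypothetical footprint table: each engine is recognized when all of its
# markers appear somewhere in the downloaded page source.
FOOTPRINTS = {
    "MediaWiki": ['name="generator" content="MediaWiki', "action=edit"],
    "WordPress": ["wp-content/", 'name="generator" content="WordPress'],
}

def detect_engine(html: str) -> Optional[str]:
    """Return the first engine whose footprints all appear in the page source,
    or None, which corresponds to a 'no engine match' result."""
    for engine, markers in FOOTPRINTS.items():
        if all(marker in html for marker in markers):
            return engine
    return None
```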
Comments
Then why did I get so many "no engine matches"? I cleaned the list and then imported it into my project... What could be the problem?
Or, like Sven said, proxy problems along with actual site errors can be an issue. There are a lot of sites that get suspended for bandwidth, non-payment, non-compliance, etc. Or they just have shitty hosts and go down or take too long to load a lot of the time.
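That also explains why a cleaned list can still produce "no engine matches": if the download itself fails (bad proxy, timeout, suspended host), there is no page source to match footprints against, so the site is reported as unrecognized even though it is a perfectly good wiki. A rough sketch of that failure path, reusing the hypothetical detect_engine() above:

```python
import urllib.error
import urllib.request

def classify(url: str, timeout: float = 10.0) -> str:
    """Fetch a page and try to detect its engine. Any download failure ends up
    as 'no engine match', even if the target is a fully working wiki."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except (urllib.error.URLError, TimeoutError) as exc:
        # Proxy errors, DNS failures, suspended accounts, slow servers...
        return f"no engine match (download failed: {exc})"
    engine = detect_engine(html)
    return engine or "no engine match (no footprints found)"
```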