[Request] - Don't allow posting to duplicate domain across group of projects

I know that I, like many others, have processing "groups" specifically meant for filtering freshly scraped lists: we import a huge file of freshly scraped targets across multiple (20+) projects and let SER do its thing. The problem is that during these processing stages, domains get pounded, which increases the OBLs on the sites and contributes to their faster death as well. For some imports this isn't a problem, since we naturally dedupe by domain on scrape, but for other scrapes it doesn't make sense to dedupe by domain, so we end up with lots of duplicate contextual domains. To preserve the lifespan of the targets, and also to limit the number of OBLs you are hitting them with, I think this would be an awesome feature for SER to have. What do you think @sven?
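For context, "deduping by domain" here just means keeping only one target URL per domain in an imported list. A minimal sketch of that preprocessing step (the helper name is hypothetical, not part of SER; hostnames are compared verbatim, so subdomains count as separate domains):

```python
from urllib.parse import urlparse

def dedupe_by_domain(urls):
    """Keep only the first URL seen for each domain (hostname)."""
    seen = set()
    kept = []
    for url in urls:
        domain = urlparse(url).netloc.lower()
        if domain and domain not in seen:
            seen.add(domain)
            kept.append(url)
    return kept

targets = [
    "http://example.com/page1",
    "http://example.com/page2",  # duplicate domain, dropped
    "http://other.org/post",
]
print(dedupe_by_domain(targets))
```

Running this external to SER on the scraped file before import would avoid hitting the same domain from every project, which is essentially what the request asks SER to do internally.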

Comments

  • SvenSven www.GSA-Online.de

    That's not going to work with the way SER operates. Projects do not share such information at all. It would cause a massive slowdown if, on each submission, SER had to check a lot of projects to see whether they had used that site or not.
