skip sites with the following words in url/domain - request to make it root domain specific not url
googlealchemist
Anywhere I want
I have a curated list of super-spammy keywords (basically porn/gambling/pills-type stuff), and I want to avoid getting links from any domain that contains them.
I see the main system-wide settings have a blacklist for actual domains, but within the specific project settings I can add my keyword blacklist in the "skip sites with the following words in url/domain" section.
It's the 'url' part that's holding me up. I have a ton of scraped sites, and a lot of them were used for building links to sites in the niches I want to avoid, e.g. redirects/indexers/contextual profile URLs:
gooddomain.com/redirect=spammynichedomainurl.com
gooddomain.com/userprofilesuperspammykeywordhere
gooddomain.com/indexersite-spammykeyworddomainhere.com
etc
I still want to get a link from that good domain, so I don't want SER to filter it out just because the full URL happens to contain one or more of my blacklisted spam keywords.
I'd rather not have to strip every domain in my lists to its root and re-import/identify the whole thing to clean this up. Beyond the resources wasted doing that, I'd worry it wouldn't identify all the same link targets again the way it did with the inner pages.
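For clarity, the behavior I'm requesting amounts to something like this (a minimal Python sketch of the idea, not SER's actual code; the keyword list and function name are made up):

```python
from urllib.parse import urlparse

# Hypothetical blacklist entries standing in for my real spam-keyword list
SPAM_KEYWORDS = ["spammykeyword", "spammyniche"]

def is_blacklisted(url: str, keywords) -> bool:
    """Check blacklist keywords against the root domain only,
    ignoring the path and query string of the URL."""
    host = urlparse(url).hostname or ""
    return any(kw in host for kw in keywords)

# The inner-page URLs above would be kept, since only the host is checked:
print(is_blacklisted("http://gooddomain.com/redirect=spammynichedomainurl.com",
                     SPAM_KEYWORDS))  # False: keyword is in the path, not the domain
print(is_blacklisted("http://spammykeyword-domain.com/page",
                     SPAM_KEYWORDS))  # True: keyword is in the domain itself
```

In other words: match the keywords against the hostname only, so a clean domain isn't rejected just because the spam keyword appears somewhere in the path or query string.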