How can I avoid scraping my own site's content?
googlealchemist
Anywhere I want
The output filter lets us filter content based on keywords found, but how would I prevent my own sites from being scraped?

If I have a site that sells blue widgets, my primary keyword is "blue widgets", and I'm already ranking at the top (or at least somewhere on the first page), then using CG to scrape articles for "blue widgets" is going to pull content from my own site, at least when it pulls from the search engines.

How do I add my domains to a blacklist filter, or how would I otherwise sort this out?

Thanks
Comments
In this case, does 'url' cover the whole domain or just the specific page URL that I enter? Can I just enter the root and have it skip the entire site?
https://baddomain.com*
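I can't speak to exactly how CG applies that pattern internally, but the trailing `*` is usually treated as a wildcard, so one root entry matches every page under the domain. A minimal sketch of that behavior (the `BLACKLIST` entries and the example URLs are just placeholders, not anything from CG itself):

import fnmatch

# Hypothetical blacklist patterns -- the trailing * is assumed to act as a
# wildcard, so a single root entry covers every page under the domain.
BLACKLIST = [
    "https://baddomain.com*",
    "https://www.baddomain.com*",
]

def is_blacklisted(url: str) -> bool:
    """Return True if the URL matches any blacklist pattern."""
    return any(fnmatch.fnmatch(url, pattern) for pattern in BLACKLIST)

# Example: filter scraped search results before pulling article content.
results = [
    "https://baddomain.com/blue-widgets-guide",
    "https://example.org/blue-widgets-review",
]
to_scrape = [url for url in results if not is_blacklisted(url)]
print(to_scrape)  # only the example.org result remains

So entering your own root domain with a trailing wildcard should be enough to skip the entire site, rather than adding each page one by one.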