
How can I avoid scraping my own sites' content?

googlealchemist Anywhere I want
edited March 2023 in GSA Content Generator
The output filter allows us to filter content based on keywords found...

But how would I prevent my own sites from being scraped?

If I have a site that sells blue widgets, my primary keyword is blue widgets, and I'm already ranking at the top (or at least somewhere on the first page), and I then use CG to scrape articles for blue widgets, then with the search engines at least it's going to be pulling content from my own site.

How do I add my domains to a blacklist filter, or how would I otherwise sort this out?

Thanks

Comments

  • Sven www.GSA-Online.de
    In project options you can use filters, and there is also one for URLs.
  • googlealchemist Anywhere I want
    Sven said:
    In project options you can use filters, and there is also one for URLs.
    In the filter/modify tab for each project, we click Add, then Manual, and in the 'search for' box we add our domains. In the 'action' drop-down we select 'url skip', and in the 'applies to' drop-down we select 'article'?

    In this case, does 'url' cover the whole domain or just the specific page URL that I enter? Can I just enter the root and have it skip the entire site?


  • Sven www.GSA-Online.de
    edited October 2022
    You need to use masks here if you want a whole domain to be skipped, e.g. https://baddomain.com* (see the sketch after the comments for how such a mask matches).
    Thanked by 1: googlealchemist
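
For anyone curious how a mask like that behaves, here is a minimal sketch in plain Python. It is not GSA Content Generator's actual filter code, and the domain names are made-up placeholders; it only illustrates how a trailing * makes a rule cover the root and every page under it.

```python
# Minimal sketch (not GSA's internal code) of how a wildcard URL mask
# such as "https://baddomain.com*" can skip a whole domain.
from fnmatch import fnmatch

# Hypothetical mask list: your own domains, each with a trailing *
# so the rule covers every page on the site.
skip_masks = ["https://myshop.example*", "http://myshop.example*"]

def should_skip(url: str) -> bool:
    """Return True if the URL matches any skip mask."""
    return any(fnmatch(url, mask) for mask in skip_masks)

# The root and every sub-page of the masked domain are skipped,
# while other sources are kept.
for url in [
    "https://myshop.example/",
    "https://myshop.example/blue-widgets/guide",
    "https://othersite.example/blue-widgets",
]:
    print(url, "-> skip" if should_skip(url) else "-> keep")
```

Running this prints "skip" for both pages on the masked domain and "keep" for the other source, which mirrors the effect Sven describes of adding your own root with a trailing * to the URL filter.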