
Cleaning up Target URLs that return download failed

I'd like to clean up my site lists that come up as download failed when I run them through GSA.

I have figured out that the website itself is nothing but a registered, parked domain.

Unfortunately, it returns a 200 status code when I check the server header.

The best I was able to do was use the ScrapeBox Broken Links Checker addon. When I ran it, it returned an "Error 403" code. I'm not sure if that means an HTTP 403 code.

Is there any way to identify these types of sites and remove them?
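
For reference, here is a rough sketch (Python, using the requests library) of the kind of check I have in mind: fetch each URL and, since the status code alone comes back 200, look for common parked-page phrases in the body. The marker strings and the file name are just guesses for illustration:

```python
import requests

# Phrases that often show up on parked / for-sale pages.
# These markers are guesses; adjust them for the parking
# services you actually run into.
PARKED_MARKERS = [
    "this domain is for sale",
    "domain is parked",
    "buy this domain",
]

def looks_parked(url):
    """Guess whether a URL is a parked domain.

    Parked pages usually return HTTP 200, so the body text
    has to be checked instead of trusting the status code.
    """
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException:
        return False  # download failed outright; a separate problem
    if resp.status_code != 200:
        return False  # 403s etc. are also a separate problem
    body = resp.text.lower()
    return any(marker in body for marker in PARKED_MARKERS)

# "site_list.txt" is a placeholder for an exported target list.
with open("site_list.txt") as f:
    for url in (line.strip() for line in f if line.strip()):
        if looks_parked(url):
            print("parked:", url)
```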

Comments

  • loopline autoapprovemarketplace.com
    Well, you could find common text and use the Page Scanner in ScrapeBox to qualify them.

    However, I think it's easier in this case to use one of the multiple tools SER has built in.

    If you go to Options >> Advanced >> Tools, you will see a "clean up" option, which might do what you want, or you can do an "Import URLs (Identify platforms and sort in)", which will let you save the valid ones.  There are some other tools there that could probably help too, and they will sort things out for you.
  • Thank you so much, loopline. I used the SER tool to identify platforms and it worked great.

    On another note, it is a pleasure to meet the brains behind ScrapeBox :). Last night I was having serious breakthroughs in the program and absolutely LOVE it. Thank you so much!!!
  • s4nt0s Houston, Texas
    @loopline is in the building! Good to see you here Matt :)
  • loopline autoapprovemarketplace.com
    @Live4TheRisk
      You're welcome for the help.  I'd love to take credit for ScrapeBox, but I don't actually work for them; SweetFunny and 1 other are the people who run it.  I do, however, make lots of tutorials for it, as ScrapeBox has helped me out a lot, so glad you found them helpful.

    @s4nt0s
     Glad to be here mate!  Thanks for the welcome.  :)
  • Is the SER clean up tool as accurate as the ScrapeBox one?
    Just a thought...