Cleaning up Target URLs that return "download failed"
I'd like to clean up my site lists that come up as "download failed" when I run them through GSA.
I have figured out that the website itself is nothing but a registered, parked domain.
Unfortunately, it returns a 200 code when I check the server header.
The best I was able to do was use the ScrapeBox Broken Links Checker addon. When I ran it, it returned an Error 403. I'm not sure if that means an HTTP 403 code.
Is there any way to identify these types of sites and remove them?
Comments
However, I think it's easier in this case to use one of the multiple tools SER has built in.
If you go to Options >> Advanced >> Tools, you will see a "clean up" option, which might do what you want, or you can do an Import URLs (identify platforms and sort in), which will let you save only the valid ones. There are some other tools there that could probably help as well; they will sort things out for you.
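If you'd rather script it outside of SER, a rough sketch like the one below could also work. This is just a minimal example, assuming Python with the requests library; urls.txt and urls_clean.txt are placeholder filenames, and the parked-page phrases are illustrative guesses you would tune against the pages you actually see. The point is that a header check alone can't catch these (the parked page returns 200, as you found), so you have to fetch the body and look at the content.

```python
import requests

# Phrases that commonly appear on parked/for-sale placeholder pages.
# Illustrative guesses only -- tune this list against the pages you actually see.
PARKED_MARKERS = [
    "this domain is parked",
    "domain is for sale",
    "buy this domain",
    "related searches",
]

# Some hosts 403 obvious bots, which may explain the ScrapeBox result;
# a browser-style User-Agent avoids the most basic blocks.
HEADERS = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}

def looks_parked(url, timeout=10):
    """True if the URL is unreachable, non-200, or serves a parked-looking page."""
    try:
        resp = requests.get(url, headers=HEADERS, timeout=timeout)
    except requests.RequestException:
        return True  # dead/unreachable -- drop it from the list anyway
    if resp.status_code != 200:
        return True  # 403s, 404s, etc. get filtered out too
    body = resp.text.lower()
    return any(marker in body for marker in PARKED_MARKERS)

# Filter a plain-text list (one URL per line) down to live, non-parked sites.
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

clean = [u for u in urls if not looks_parked(u)]

with open("urls_clean.txt", "w") as f:
    f.write("\n".join(clean))
```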
On another note, it is a pleasure to meet the brains behind ScrapeBox. Last night I was having serious breakthroughs with the program and absolutely LOVE it. Thank you so much!!!
You're welcome for the help. I'd love to take credit for ScrapeBox, but I don't actually work for them; SweetFunny and 1 other are the people who run it. I do, however, make lots of tutorials for it, as ScrapeBox has helped me out a lot, so I'm glad you found them helpful.
@s4nt0s
Glad to be here mate! Thanks for the welcome.
Just a thought...