There has been some discussion about checking deindexed sites (through index check). This is a similar idea, but an improvement on the existing verify links function in SER - fetch and test the HTTP status response code of verified links:
Could be implemented in: Right-click campaign > Show URLs > Verified > Status Check
Then fetch the URL headers (getHeaders) to see whether the link returns a 404/501/403 error or a 200 OK, and offer an option to delete the error links (404/501/403) from the verified links list. Similar to the Blog Analyzer function in Scrapebox or the fetch URL function in Google Webmaster Tools.
This would save building subsequent tiered links to 404 pages and help improve the efficiency of the campaign in general. It doesn't need to be compulsory, just optional for the user, so you could check once per week and clear out bad links.
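A minimal sketch of what that check could look like, using Python's stdlib urllib rather than SER's internals; the set of "dead" codes and the function names are my assumptions based on the codes mentioned above:

```python
import urllib.request
import urllib.error

DEAD_CODES = {403, 404, 501}  # error codes worth pruning, per the suggestion above

def check_status(url, timeout=10):
    """Issue a HEAD request (headers only, no body) and return the
    HTTP status code, or None on timeout/connection failure."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 4xx/5xx responses still carry a definite code
    except (urllib.error.URLError, OSError):
        return None    # timeout, DNS failure, connection refused

def is_dead(status):
    """Treat timeouts and the error codes above as removable links."""
    return status is None or status in DEAD_CODES

def prune(urls):
    """Return only the links worth keeping (200 OK and similar)."""
    return [u for u in urls if not is_dead(check_status(u))]
```

Run weekly over the verified list, anything `is_dead` flags could be offered for deletion before tiered links get built to it.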
Does this function work if I just select all and re-verify the links?
I tested this, and some blogs that came back as unverified (red) actually had a 200 OK status and were live. It seems to affect all the engines - SER is having trouble retrieving the correct status.
In SER, out of 350 URLs, 43 were returned as unverified (error).
In Scrapebox Blog Analyzer:
9 had a 404 response
4 had a 501 response
2 had a 403 response
2 had a timeout
26 were returned as 200 OK
I did notice that in some cases Scrapebox Blog Analyzer was throwing up an occasional 404 error which was in fact 200 OK (probably due to the site owner's configuration), but overall getHeaders seemed to be more reliable than the verify links function in SER. I think sites with redirects, pop-ups, malware warnings, or some JS are blocking SER from retrieving the correct link/site status, or there is some sort of bug with this function.
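One possible explanation for some of those mismatches, offered as a guess: certain servers reject HEAD requests outright (405/501) even though a GET to the same URL succeeds, and redirect chains can mask the final status. A hedged sketch of a fallback check (stdlib Python, not SER's or Scrapebox's actual implementation; urllib follows redirects automatically, so 301/302 chains resolve to the final target's status):

```python
import urllib.request
import urllib.error

def status_of(url, method="HEAD", timeout=10):
    """Return the final HTTP status for `url`, or None on
    timeout/connection failure."""
    req = urllib.request.Request(url, method=method)
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code
    except (urllib.error.URLError, OSError):
        return None

def robust_status(url):
    """HEAD first (cheap, headers only); if the server rejects HEAD
    itself (405/501), retry with a full GET before calling the link dead."""
    code = status_of(url, method="HEAD")
    if code in (405, 501):
        code = status_of(url, method="GET")
    return code
```

A check like this would distinguish "the server dislikes HEAD requests" from "the page is actually gone", which might account for some of the 501s above.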