
Why are many failed re-verified links in fact LIVE???

edited August 2014 in Need Help

After using the software for a few weeks (using great bought lists) I noticed that although I get hundreds of verified links every day, the total number of verified links in each project has not been increasing by much at all.

Recently I became aware that you can import failed re-verified links. After I imported them and verified again, I found that at least 20-25% of the links were actually live.

This is what I did:
right-click on project > Show Verified Links > Import Failed Re-verification > Verify Selected
As I said 20-25% of those failed re-verified links were live.

This is happening in all project types: contextual, bookmarks, etc.

Why is this happening? Why does re-verification fail on so many good/live links? Across 22 projects I was able to bring over 2000 links back to verified (from failed re-verified). Since I'm building tiers, I want to make sure my live links are not disappearing from the verified lists.

I use private proxies and the standard setup: verify after 1440 minutes and re-verify after 10000 minutes.

What is going on?
Thx

Comments

  • I suspected the same thing for a couple of weeks.

    So I stopped building links for a couple of projects, and what I did was export the verified URLs of some projects, and I kept doing this for the selected projects every single day. I set the status to re-verify only, every 4320 minutes.

    Every selected project has numbers close to what I am going to lay out.

    August 14th verified links: 1641
    August 17th verified links: 1429
    August 21st verified links: 1274
    August 24th verified links: 1153

    That is close to a 30% link loss (488 of 1641). These links are from my elite site lists, so that rate (all contextual articles) is very high. Since I already had the verified list from 10 days ago, I pulled out only the lost links and rechecked them with Scrapebox.

    What I found is that out of 488 lost links, 218 were in fact alive.
    That is a crazy number: almost 45%.

    You can also try this very easy test: export your verified links, wait a couple of days, then re-verify that project's links and extract the dead ones. Export the project's TARGET URLs, fire up Scrapebox, load the target URLs as your own sites, and select "check links". Then load up the exported verified list. I guarantee you will find lots of alive links that are marked as dead.

    @Sven Any chance you could look at the verified-link removal procedure, please?
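    The "check links" step described above can be sketched in plain Python. This is a hypothetical illustration, not Scrapebox's or SER's actual code, and names like `check_backlink` and `LinkCollector` are my own. The key point it encodes: a network or server error should be treated as "unknown", not "dead".

    ```python
    import urllib.request
    from html.parser import HTMLParser

    class LinkCollector(HTMLParser):
        """Collects every href found in an HTML document."""
        def __init__(self):
            super().__init__()
            self.hrefs = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.hrefs.append(value)

    def check_backlink(page_url, target_domain, timeout=15):
        """Return True if page_url still links to target_domain.

        A network error or non-200 status returns None ("unknown"), not
        False: a proxy or server hiccup is not proof the link is gone,
        which is exactly why one failed check should not delete a link.
        """
        try:
            with urllib.request.urlopen(page_url, timeout=timeout) as resp:
                if resp.status != 200:
                    return None
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            return None  # timeout, DNS failure, proxy error: unknown, not dead
        parser = LinkCollector()
        parser.feed(html)
        return any(target_domain in href for href in parser.hrefs)
    ```

    Feeding such a checker the "dead" export and counting the True results is essentially the Scrapebox experiment above.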


  • Sven www.GSA-Online.de
    Well, I can try to optimize this, but what else can it do? If the link is no longer there, it has to remove it. The only way this can be misinterpreted is a server sending a temporary error page, or a proxy doing the same.
  • edited August 2014
    @Sven Maybe a counter or something for not-found URLs.

    That would do the trick: delete a link only if it couldn't be found on the 3rd check, or if it couldn't be found 3 days in a row.

    That would make sure most of the "still live" links stay within SER.
  • Whatever you decide @Sven as long as we have a way of recovering (or not losing) these links that end up in the failed re-verification folder.

    One thing: it took me a lot of time to go into each project and manually re-verify the failed re-verified links.
    It would be good if we had an option to highlight all projects > verify all failed re-verified.

    Otherwise the software works very well for me, and using @Ron's lists I have been able to build lots of links.
    Thanks

  • ron SERLists.com
    @derdor - This issue happens in all multithreaded software that checks links. I was going nuts with this issue with Scrapebox when it first came out years ago. SB (as well as SER) gets a bunch of false negatives as well. It's all caused by line noise, server timeouts, etc.

    I reached a point where I needed to decide whether it was worth it to chase my tail after about 10% of the links that seem to die and reappear all the time. I finally concluded that it wasn't worth it...that I was building links in massive amounts...that I just needed to focus on the bigger picture.

    Anyway, I just thought I would throw that in there as this was a topic that bothered me for a long time with multiple products, lol.  
  • Good to know this is not just my problem, but for me it's more like 20-30% of links.
  • I'm with @ron, I too concluded it's not worth the time and resources, and I disable the re-verification option in my T1 projects.

    But if @Sven wants to improve this re-verify option, here's my suggestion (somewhat in line with what @maree said).

    If the re-verification fails, instead of deleting the failed urls from the project's Verified list, just mark them as "Dead", and do not count them in the Verified column, and do not build links to them.

    Then occasionally (once a week), we could set the project(s) status to Active (Re-Verify Dead). If a "Dead" URL is found to be alive, set it to "Alive", count it in the Verified column, and continue building links to it...

  • @ron @maree @Olve1954
    Same goes for me. In general it is like 25%. I am also not re-verifying everything; I just experimented on a couple of projects that I suspected. The problem is that this can be fixed with some coding by @Sven, as I suggested, or as Olve1954 suggested.

    Dead links could be moved to a "dead links" group, and we could re-verify them manually or automatically.

    What I am asking is: just don't delete a link the first time it is not found; let us check it 1-2 more times.
    25% is huge, btw. If you consider the indexing rate, you can understand that the real impact of this 20-25% is even higher.
  • edited August 2014
    Here's a better idea,

    If the re-verification fails, just mark them as "Dead", and do not count them in the Verified column, and do not build links to them.

    In the project's Option, add an option to "Re-verify Dead URLs"

    (screenshot of the proposed option)

    If the "Dead" url is found to be alive, set it to "Alive", and count them in the Verified column, and continue to build links to them.

    If we really need to delete these "Dead" urls,

    Show URLs -> Verified, filter "Dead", and press the delete key.

    Or maybe we should have another column, "No. of Failed Attempts" (i.e. the number of times a URL was found dead); then @derdor could manually delete his URLs if they have failed 3 times or more...
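    The policy proposed in this thread (a failure counter, a "Dead" state excluded from the Verified count, and resurrection when a link reappears) can be sketched as follows. This is a minimal illustration of the suggestion, not SER's actual code; the names and the threshold of 3 are assumptions taken from the comments above.

    ```python
    from dataclasses import dataclass
    from typing import Optional

    MAX_FAILS = 3  # delete only after the 3rd consecutive failed check

    @dataclass
    class VerifiedUrl:
        url: str
        fails: int = 0      # consecutive failed re-verifications
        dead: bool = False  # excluded from the Verified count while True

    def record_check(entry: VerifiedUrl, found: bool) -> Optional[VerifiedUrl]:
        """Apply one re-verification result.

        Returns None when the URL should finally be removed; otherwise
        returns the updated entry.
        """
        if found:
            entry.fails = 0
            entry.dead = False  # resurrected: count it and link to it again
            return entry
        entry.fails += 1
        entry.dead = True       # hide from Verified, stop building tiers to it
        if entry.fails >= MAX_FAILS:
            return None         # only now is the link really deleted
        return entry
    ```

    Under this scheme, a link that fails twice because of a proxy error and then reappears loses nothing: its counter resets to zero and it is counted as verified again.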
