
I think I found the cause for low verified %

Hi guys,

As most of you know, when I'm running global lists I've been getting a very low verified rate of 2-5%, versus 15-20% a few months ago.  I've been struggling to fix this for the past few months.

I've spent a lot of time debugging, I think I've found the cause, and I'd be interested in other forumites' comments on this.

I think the low verified rate is caused by endless resubmission to the SAME sites ONCE you have exhausted your global lists.  Basically, the way SER works is that UNLESS you clear your target URL history, the next time it submits to a site where it already has an account, it will use that account to submit the second link.  My thesis is that lots of sites block 2nd and 3rd submissions, hence the low verified rate even though the submission goes through.
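
Just to make the logic concrete, here's a toy model of what I think is happening (this is NOT SER's actual code, just a rough Python sketch with made-up numbers): once every domain in the list has been hit once, later passes are almost all resubmissions, and if most sites block 2nd/3rd posts the verified rate collapses even though submissions still "go through".

```python
# Toy model of the thesis -- NOT GSA SER internals, all numbers are assumptions.
import random

random.seed(1)

domains = [f"site{i}.com" for i in range(500)]                # pretend global list
blocks_resub = {d: random.random() < 0.85 for d in domains}   # assumed share of sites blocking repeat posts
FRESH_VERIFY_RATE = 0.18                                      # my normal 15-20%

history = set()  # the project's "target URL history"

def run_pass():
    submitted = verified = 0
    for d in domains:
        submitted += 1
        if d not in history:                       # first time on this site
            history.add(d)
            verified += random.random() < FRESH_VERIFY_RATE
        elif not blocks_resub[d]:                  # resubmission, site allows it
            verified += random.random() < FRESH_VERIFY_RATE
        # else: submission goes through but never verifies
    return verified / submitted

for n in range(1, 4):
    print(f"pass {n}: verified ~ {run_pass():.1%}")
```

With those assumed numbers the first pass should come out around 15-20% and every pass after that should drop to the 2-3% range, which matches what I'm seeing.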

To test this I started a new copy of GSA with a CLEAN global list, left it running for 2-3 days to build up a very small global list, and then deleted duplicate URLs from the list.
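
If anyone wants to do the same dedup on a plain-text export of their list outside of SER, something like this works (the file names are just placeholders, not SER's own files):

```python
# Remove duplicate URLs from a plain-text site list export.
# "sitelist.txt" / "sitelist_dedup.txt" are placeholder names.
seen, kept = set(), []

with open("sitelist.txt", encoding="utf-8", errors="ignore") as f:
    for line in f:
        url = line.strip()
        if url and url not in seen:   # keyed on the full URL;
            seen.add(url)             # key on the domain instead to be stricter
            kept.append(url)

with open("sitelist_dedup.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(kept) + "\n")

print(f"kept {len(kept)} unique URLs")
```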

Then I ran ONE single project to isolate it: I changed its status to use the global site list only and left it running for a day.  I have "avoid posting url on same domain twice" UNCHECKED, as advised on these forums, to increase submissions on this project (it's a tier 2 project).

In theory it should just submit to the global list, and since it's using the site list I would expect 15-20% verified, which is my normal verified rate when submitting with global site lists OFF.

What I found is that the verified rate is extremely low, between 2-4%.  I think it's because GSA has already submitted to these sites, and as it resubmits to them again and again, lots of them disallow the repeat submissions.
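
The numbers roughly add up, too. A quick back-of-envelope check (all three figures are assumptions, not measurements):

```python
# Back-of-envelope check -- the numbers are assumptions, not measurements.
fresh_rate  = 0.18   # what fresh targets normally verify at (15-20%)
resub_rate  = 0.00   # resubmissions to sites that block repeat posts
resub_share = 0.85   # share of targets in the pass that are resubmissions

overall = (1 - resub_share) * fresh_rate + resub_share * resub_rate
print(f"expected verified rate: {overall:.1%}")   # ~2.7%, right in the 2-4% range
```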

I invite you to test the same on your machine if you have been getting low verifieds.  This problem develops over time: as your proxies get blocked and your scraping from search engines slows, you rely more on global lists, so SER keeps resubmitting to the same sites over and over and the submissions don't get verified.

If my thesis is right, the way around this would be that every now and then (i.e. once each project has submitted to everything in your global lists), you have to delete your target URL history as WELL as the created accounts, put new email addresses in (so the accounts actually get created) and then rerun the project.
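
In SER you do all of this through the project itself, but just as an abstract picture of what the reset has to touch (hypothetical data structures, not SER's actual files or options):

```python
# Abstract illustration of the periodic reset -- hypothetical structures,
# not SER's actual files or options.
def reset_project(project, fresh_emails):
    project["target_url_history"].clear()    # old targets become submittable again
    project["accounts"].clear()              # drop accounts tied to the old emails
    project["emails"] = list(fresh_emails)   # new emails so new accounts get created
    return project

tier2 = {
    "target_url_history": {"site1.com", "site2.com"},
    "accounts": {"site1.com": "olduser1"},
    "emails": ["old@example.com"],
}
reset_project(tier2, ["fresh1@example.com", "fresh2@example.com"])
```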

I've done this and my % verified shoots through the roof again.

@sven would be interested in your comments on this.

Comments

  • MrX  Germany
    edited July 2013
    I already thought of something similar so I started to clear cache/history/accounts once per month for every project and then feed them with new articles/content!
  • Anyone else see a similar drop in verified for tier 2 as it keeps resubmitting to the same list and creating dupes?
  • AlexR  Cape Town
    I haven't tested it but will try and keep a lookout for it. 
  • My Tier 2 submissions are higher these days.