
How do I deal with this dreaded "Already parsed" message?

Hey guys
I tried posting to a bunch of sites using GSA SER and came across a weird problem.
For most of the sites it said "already parsed" in the logs, even though I had unchecked 'avoid posting to the same domain twice'.

What exactly does 'already parsed' mean and how can I make it post to each and every URL in the list? (I used a custom target list, by the way).

Comments

  • The URL has already been parsed by SER. You must have imported it and posted to it before.
    You can delete the target URL history.
  • I did delete the target URL history and target URL cache, but I still keep getting 'already parsed' messages.
  • Oh, I see.
    Your list has a lot of duplicate sites.

    You can remove the duplicate sites first.
  • I removed duplicate URLs, but I want to keep duplicate domains, as removing duplicate domains would shrink my list down a great deal.
    Isn't there any way around it?

  • :@)
    I'm sorry , I'm not good at English.
    I do not understand what you mean.
    You want only removed duplicate URLs and to keep duplicate domains ???  scrapebox can do it.
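
    The URL-level dedupe being discussed here (drop exact repeats but keep multiple pages per domain) can be sketched in a few lines of Python; this is a hypothetical standalone script with made-up example URLs, not a GSA SER or Scrapebox feature:

```python
# Hypothetical helper: de-duplicate a scraped target list by exact URL,
# keeping several URLs that live on the same domain (unlike a domain-level
# dedupe, which shrinks the list much further).
from urllib.parse import urlparse

def dedupe_urls(urls):
    """Drop exact duplicate URLs; duplicate domains survive."""
    seen = set()
    out = []
    for u in urls:
        if u not in seen:
            seen.add(u)
            out.append(u)
    return out

def dedupe_domains(urls):
    """Keep only the first URL seen per domain (this is what shrinks the list)."""
    seen = set()
    out = []
    for u in urls:
        domain = urlparse(u).netloc.lower()
        if domain not in seen:
            seen.add(domain)
            out.append(u)
    return out

targets = [
    "http://example.com/blog/post-1",
    "http://example.com/blog/post-1",  # exact duplicate -> dropped
    "http://example.com/blog/post-2",  # same domain, new URL -> kept
    "http://other.org/gallery/",
]
print(len(dedupe_urls(targets)))     # 3 URLs remain
print(len(dedupe_domains(targets)))  # only 2 domains remain
```

    Which of the two you run is exactly the trade-off in this thread: URL-level keeps the list big, domain-level matches what SER does when 'avoid posting to the same domain twice' is enabled.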

  • edited May 2013
    Okay, let me be a little more clear :)
    Here's what I did:

    1) Scraped lots of URLs.
    2) Removed duplicate URLs.
    3) Unchecked 'avoid posting to the same domain twice' option.
    4) Deleted target URL history and cache
    5) Started posting.

    And still I'm getting lots of 'already parsed' messages in the logs and GSA does not post on those URLs.
    My question is, how can I make it post on those sites? 

    Hope it makes more sense now.
  • I think maybe you should remove duplicate domains,
    because SER avoids posting to the same domain twice.
  • I would look at your keyword list before you look anywhere else. Get yourself scrapebox or similar and get your own keyword list.
  • @stone1989715
    I don't want to remove duplicate domains.

    @NeilB74
    I'm using my own custom target list and am not using any keywords/search engines to scrape for targets.
  • @Sven, would appreciate your input.
  • I also have the same problem. I have a list of URLs which I've imported, with all duplicate URLs removed beforehand. I run through this fresh list, and even though the project has never visited these URLs before, it still says 'already parsed'. I've noticed this a lot with the image gallery platforms. The thing is, when it does this, it's actually doing it on URLs where a link can be left, as there are already links on the page from other people, so I'm missing out on backlinks. Also, if I now run through the same list again, but this time enable 'continuously try to post to a site even if failed before', it places the link. So why is this occurring on URLs which haven't been visited before, and where a backlink could actually be left?
  • SvenSven www.GSA-Online.de
    Enable the option "Continuously try to post to a site even if failed before". This will ignore the 'already parsed' message. Though @doubleup, this can not happen: the message only comes up if the domain/URL has been parsed before.
  • edited May 2013
    Thanks @Sven. I don't see that message anymore. Cheers for the great customer support :)

    Edit: For anyone who's going to try the above fix by Sven, be advised that it does slow the process down (that's pretty obvious, though). For me it's worth it.
  • @Sven I know it's a weird one, and shouldn't happen, but I've definitely witnessed it. Just yesterday, I was playing around with a new project, disabled the 'continuously try to post to a site even if failed before' setting, and watched as the project skipped over image platform URLs, stating 'already parsed', even though it had never seen them before, as it was a brand-new project. I then enabled the setting again, deleted the cache, reloaded the URLs, and the project then submitted to the image platform URLs.
  • spunko2010spunko2010 Isle of Man
    How would one go about deleting duplicate URLs? Export to Excel and sort...?
  • SvenSven www.GSA-Online.de
    URLs are added to the history even if the "continuously try to post to a site even if failed before" setting is enabled. So if you disable it later, you will see the message all over. Also note that some engines add the whole domain as parsed, either because just one posting location per domain is expected, or because the option to not post to the same domain twice is enabled.
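
    The behaviour Sven describes boils down to what key goes into the history: the full URL, or the whole domain. A rough sketch, written as illustrative Python logic rather than GSA SER's actual implementation:

```python
# Hypothetical sketch of the "already parsed" check described above: the
# history can be keyed by full URL, or, for engines that expect only one
# posting location per domain (or when "avoid posting to the same domain
# twice" is on), by the whole domain. Illustrative only, not SER's code.
from urllib.parse import urlparse

history = set()  # previously parsed keys (URLs or domains)

def already_parsed(url, domain_level):
    """Return True if this target was seen before; record it otherwise."""
    key = urlparse(url).netloc.lower() if domain_level else url
    if key in history:
        return True
    history.add(key)
    return False

# Domain-level keying skips a second URL on the same domain:
print(already_parsed("http://example.com/gallery/a", domain_level=True))  # False
print(already_parsed("http://example.com/gallery/b", domain_level=True))  # True
```

    Under this model, a domain-keyed history would explain seeing 'already parsed' on gallery URLs the project never visited, as long as some other URL on the same domain had been parsed first.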