
Frequent Platform Identifier Crashes


I recently started using GSA Platform Identifier, and it has not been easy working with the software.

Within a day it can randomly crash, showing only the message below:

GSA Platform Identifier has stopped working
A problem caused the program to stop working correctly.
Windows will close the program and notify you if a solution is available.

What could be the problem, and how do I solve it? It makes the whole automation process useless. The program is meant to monitor a folder that Scrapebox Automator exports its harvested URLs into, identify the platform of each URL, and put the results into the SER Identified folder.
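For context, PI's monitoring loop is closed-source, so the sketch below is only a conceptual illustration of "watch a folder and hand each new export to an identification step"; the paths, polling interval, and `handle` callback are my own assumptions, not PI's actual internals:

```python
import time
from pathlib import Path

def poll_once(watch_dir, seen):
    """Return .txt files in watch_dir not yet processed, and mark
    them as seen so the next poll skips them."""
    new_files = [p for p in sorted(Path(watch_dir).glob("*.txt"))
                 if p.name not in seen]
    seen.update(p.name for p in new_files)
    return new_files

def watch(watch_dir, handle, interval=5.0):
    """Poll watch_dir forever, passing each new export file to handle
    (e.g. a function that identifies platforms and writes results)."""
    seen = set()
    while True:
        for path in poll_once(watch_dir, seen):
            handle(path)
        time.sleep(interval)
```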

This is really frustrating because you cannot tell when Platform Identifier will crash, so you have to keep checking back on the server every now and then.

How do I resolve this issue?



  • Sven
    A random crash is always hard to debug. Is there any specific situation I could set up to try to reproduce it?
  • I have it set to also create a project to remove duplicates, and my suspicion is that the crash often happens when this function is called.

    I basically leave it to run without attending to it and then when I check back it might have crashed.

    I can't really point to anything concrete that might be causing it, but the frequency is simply annoying. My SER setup basically depends on it for identified links.

    Let me turn off the Remove Duplicates function for a while and see if there is any improvement.
  • s4nt0s Houston, Texas
    So you have one project that is identifying URLs and another project that is removing duplicates from that identified folder?

    I can try to test on my end as well.
  • prniches Lagos
    edited October 2019
    I turned on the "Create project to remove duplicates" feature in the program's Settings.

    Basically, Scrapebox removes duplicate URLs and exports them to a folder; PI then monitors that folder and sorts the identified platforms into their individual files; the Remove Duplicates feature then checks the identified platforms for duplicates against the SER Identified folder. That is how I believe it works at the moment.

    Since I turned off the Remove Duplicates feature some time ago, it has virtually stopped crashing, which leads me to strongly believe the crashes are connected to the Remove Duplicates project feature.
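    For what it's worth, a dedupe pass like the one described above boils down to something like the sketch below. This is a minimal Python illustration; the file names and the atomic `os.replace` write are my own assumptions for the sketch, not PI's actual implementation:

    ```python
    import os
    import tempfile
    from pathlib import Path

    def merge_dedup(new_urls_file, identified_file):
        """Append URLs from new_urls_file to identified_file, skipping
        any URL already present; the file is rewritten via an atomic
        replace so a concurrent reader never sees a half-written file.
        Returns the number of URLs added."""
        identified = Path(identified_file)
        existing = (identified.read_text().splitlines()
                    if identified.exists() else [])
        seen = set(existing)
        added = []
        for line in Path(new_urls_file).read_text().splitlines():
            url = line.strip()
            if url and url not in seen:
                seen.add(url)
                added.append(url)
        fd, tmp = tempfile.mkstemp(dir=identified.parent)
        with os.fdopen(fd, "w") as f:
            f.write("\n".join(existing + added) + "\n")
        os.replace(tmp, identified)  # atomic on the same filesystem
        return len(added)
    ```

    If the real dedupe step instead rewrites files in place while the identify step is still writing to them, that kind of concurrent access would be one plausible source of instability, though that is only a guess.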

  • HELP!      HELP!!     HELP!!!

    There is an issue even more worrisome than the crashing: it virtually does not recognize any URL, even from a 50K deduped Scrapebox scrape - not even one URL recognized.

    When I first used it, it would display some statistics at the bottom right corner of the screen, and there would usually be a significant amount of data shown on the "Total downloaded:" line, but now it just stays at 0KB all the time.

    Now all it does is keep counting the time elapsed and remaining while increasing the number of unrecognized URLs.

    The last stats show something like: 2,223 URL/min | 0KB/s

    What could be wrong?! It's holding back my work so much at the moment.

    Please help!

  • I tried running some of the smaller files directly in SER to see if there actually were platforms in the lists.

    The problem is indeed more worrisome: running the same list directly in SER with "Import URLs (identify platform and sort in)" gives me well over an 85% platform-identification rate, while Platform Identifier itself says it did not recognize any URL.

    I really need to sort this out, as I'm frankly confused now. Platform Identifier was supposed to handle this job, yet SER is virtually doing it better.

    Thanks for your timely assistance.
  • s4nt0ss4nt0s Houston, Texas
    edited October 2019
    @prniches - That is really odd. Something similar happened to users back when there was a bad DDoS attack and it triggered some licensing issues. I'm assuming you're on the latest version, 1.97?

    I've sent you a PM.
