
Platform Identifier Constantly Crashing.

I can't figure out why my PI is CONSTANTLY crashing now. It ran a 3.5M scrape just fine the other day. Now I'm trying to run one that I split down to only 1M, and after a random amount of time it completely hangs and can only be closed with CTRL+ALT+DELETE, which totally kills my current progress. I've tried randomizing the URLs before loading them into PI. I've tried trimming URLs longer than 150 characters. I've tried all kinds of things.
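For reference, this is roughly the pre-processing I'm doing before loading the list (a minimal sketch; the file names and the 150-character cutoff are just placeholders for whatever you use):

```python
import random

def prepare_urls(in_path, out_path, max_len=150):
    """Load a scraped URL list, drop over-long URLs, and shuffle it
    so a re-run doesn't hit the same domains in the same order."""
    with open(in_path, encoding="utf-8", errors="ignore") as f:
        urls = [line.strip() for line in f if line.strip()]
    # Trim URLs longer than the cutoff (the tool seems to choke on huge ones).
    urls = [u for u in urls if len(u) <= max_len]
    random.shuffle(urls)
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("\n".join(urls))
    return len(urls)  # how many URLs survived the filter
```

Even with that filtering and shuffling, it still hangs.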

This app very badly needs an auto-save function like SER, and the ability to use proxies. When it keeps crashing, I don't want to keep hammering the same domains over and over and end up with an abuse report, a 403, etc.

Currently my only way of restarting where my lists left off is doing one at a time. When it crashes, I import all the platforms it recognized (plus the unrecognized) into ScrapeBox, remove duplicate domains, select the original scrape file, and use ScrapeBox's "select url list to compare" function against all my recognized and unrecognized platforms, which is slow as hell with huge files. ScrapeBox 2 isn't any better at it either, even being 64-bit.
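The diff step itself is simple in principle; here's a rough sketch of what I'm effectively doing by hand (the function name and the domain-level comparison are my own, not anything from PI or ScrapeBox):

```python
from urllib.parse import urlparse

def remaining_urls(original, processed):
    """Return URLs from the original scrape whose domains haven't
    already been identified (recognized or unrecognized)."""
    done = {urlparse(u).netloc.lower() for u in processed if urlparse(u).netloc}
    # Keep only URLs whose domain hasn't been seen in the processed output.
    return [u for u in original if urlparse(u).netloc.lower() not in done]
```

With sets this is near-instant even on multi-million-line lists, which is why the slowness of doing it through ScrapeBox is so frustrating.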

EDIT: I've also tried running in compatibility mode. Doesn't matter. I've tried reducing threads drastically. I've tried a lot. I'm thinking it's a problematic URL or URLs, but I can't think of a way to identify which one.
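In theory you could narrow down a single bad URL by bisection: run halves of the list until the crashing half is one URL wide. A sketch of that idea (here `crashes` stands in for "run PI on this chunk and see if it hangs", which you'd obviously have to do manually or via some wrapper; it's just an illustration, not a real PI hook):

```python
def find_bad_url(urls, crashes):
    """Binary-search a URL list for a single entry that makes a run crash.
    `crashes(chunk)` must return True if running that chunk hangs the tool."""
    lo, hi = 0, len(urls)
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if crashes(urls[lo:mid]):
            hi = mid          # culprit is in the first half
        else:
            lo = mid          # culprit is in the second half
    # One candidate left; confirm it actually crashes before blaming it.
    return urls[lo] if crashes(urls[lo:hi]) else None
```

For a 1M list that's only about 20 runs, but each "run" here means babysitting the tool until it hangs, so it's still painful without auto-save.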

EDIT2: I'm running PI by itself on a separate dedi where SER isn't running. Ideas?

Comments

  • s4nt0s Houston, Texas
    edited January 2015
    I'm going to send you the latest update, which isn't available yet and addresses some issues. You might also consider right-clicking a project and backing it up occasionally, so you have a backup of where it left off in case of any issues. You can restore the backup the same way SER does.

    We're adding auto-save like SER; it's just a bit tricky because of the way it works.

    Also, if you can PM me that 1M list so I can run it through on my end, that would be great so I can see if there's some strange characters or something causing problems.

    Going to PM you the new version now.
  • Thanks for swift reply. I'm going to try the latest version. I replied to your PM.
  • edited January 2015
    @s4nt0s - 4.5 hours strong without a crash on the version you sent me. Estimated 14 hours to go at 150 threads, with roughly 2.5M URLs left to identify.

    Will post an update when I get up and get online tomorrow.
  • s4nt0s Houston, Texas
    Awesome. I actually have a new version with auto-save added, but I'm testing it a bit on my end first :)
    @s4nt0s - I processed the entire list without any crashes. At the end it wouldn't close out a few threads, but when I paused it I was able to export the sitelist etc. just fine. Not that it matters, since the URLs were already saved by engine to my specified folder.

    The auto-save sounds great. Definitely a needed feature. Hopefully proxy support can be added soon too.
  • s4nt0ss4nt0s Houston, Texas
    Sent you the new version with auto-save ;)
  • Thanks. I'll let you know if I have any issues with it.