
Platform Identifier Global Settings

Hi,

I purchased Platform Identifier today and I'm running scrape lists from ScapeBandit, now 500K URLs per list and up. What are the best settings in the global settings?

My current global settings:

Simultaneous threads: 700
Request timeout (seconds): 60
Request retry count: 2
Maximum size of website to download (MB): 40

I played around with the settings a little but get different results (in "sorted by platforms") depending on where I import the list: first the same list imported into Platform Identifier, and later a second time into GSA SER under Options > Advanced > Tools > Import URLs (identify platforms and sort in). The results differ.

I get more unrecognized URLs now in Platform Identifier. So what are the best settings for both speed and results sorted by platform?

Comments

  • s4nt0s Houston, Texas
    Default settings should be fine, but 40MB download size for websites is pretty big. Are you also running SER at 700 threads when running through the list? 

    The results are not going to be exactly the same but should be pretty similar. When you create a project you can enable the "deep matching" option which can help improve detection rate.
  • Thank you for the quick answer.

    I run the Identifier on a second computer and don't submit to the identified GSA folder in real time (a good feature in the Identifier). For now I export .sl files. (I plan to try exporting in real time into a Dropbox folder? I don't know if that works, and then going from Dropbox on the server into GSA.) My GSA is running on a server, and I don't want to lower my LpM on that machine by running two programs at the same time. I have enough resources on my own computer for the Identifier.


    I've now lowered the website download size to 15MB.
    I tried "deep matching" briefly, but that is a little too much at that thread count, since I also use that computer for other work. What I found out today: it depends on the scrape list. A scraped list with better footprints produces more results sorted by platform.

    I want to say it's a great product that saves a lot of time. The main factor is not having to stop GSA from running and lose hours importing bigger files. I use identified lists with 3 million sites or more in GSA. With ScapeBandit you now have the option to get millions of new targets per day if you know how to search for them, so your product will be a perfect tool for handling bigger scraper lists.

    Losing GSA run time on a server for up to a week per month, just importing big lists with millions of targets for some hours each day, is not fine and costs money.

    One option that would be nice, maybe in the future: a clean-up that removes non-working sites from a site list. Import an .sl file and clean it?
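    In the meantime, the clean-up idea could be prototyped outside PI with a short script. Here is a minimal sketch, assuming a plain-text URL list and treating any HTTP response as "alive" (the function names, thread count, and HEAD-request check are my assumptions, not a PI feature):

```python
# Hypothetical sketch: filter dead sites out of a plain-text URL list.
# Not part of Platform Identifier; the liveness check is an assumption.
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def default_check(url, timeout=10):
    """Treat a URL as alive if it answers an HTTP HEAD request at all."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout):
            return True
    except Exception:
        return False

def filter_live(urls, check=default_check, threads=50):
    """Return only the URLs the check function reports as alive."""
    with ThreadPoolExecutor(max_workers=threads) as pool:
        alive = list(pool.map(check, urls))  # preserves input order
    return [u for u, ok in zip(urls, alive) if ok]
```

    A HEAD request keeps bandwidth low, but some servers reject HEAD, so a fallback to GET would catch a few more live sites. Feeding an exported list through this and writing the survivors back out would give the kind of "always fresh" list discussed below.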
  • s4nt0s Houston, Texas
    Can you clarify what you mean? You're wanting to be able to dropbox from PI on one server to GSA folder on another server? Currently, sorted files would have to be transferred between servers manually, but maybe that can be added as an update.

    I'll see about adding clean up feature as well. 
  • Yes, I want to be able to use Dropbox from PI when PI is running on another server or computer and sorting the scrape list, while GSA and Dropbox are installed on another server that can read from the folder you share in your Dropbox.

    This would be a great feature! Most users don't have the best servers, just a home computer plus a VPS destination where GSA is running. Or you could use it with several VPSes running GSA that all read from one folder where you have the fresh target sites shared in your Dropbox account.

    The clean-up function would be a great feature too! Nothing is better than always having a fresh list where all dead sites are deleted.

    I think once PI delivers all those features, it's a must-have tool for GSA PRO users who want GSA working in the best way.