
♦♦♦ Beta Testers Needed For Stand Alone Identify And Sort In ♦♦♦


  • s4nt0s Houston, Texas
    @royserpa - Sorry, there won't be additional discounts other than that right now.
  • royserpa (URL and Domain Deduped!) GSA SER VERIFIED Lists! => http://gsa-verified-list.royserpa.com/
    @s4nt0s hehe alright bro.
  • What is the difference between your app and Sick Submitter's free Platform Reader?
  • s4nt0s Houston, Texas
    edited November 2014
    @fng - I downloaded Sick Platform Reader, but is there a paid version? The version I found only allows 25 threads, loads one file at a time, runs a single project at a time, etc. If that's how it is, they are way different :)
  • What is the difference between SER's built-in platform identifier and this one, apart from being able to sort platforms from multiple files and projects (which is a big thing, btw)? Is there something more?
  • s4nt0s Houston, Texas
    edited November 2014
    @katjzer - It can also sort URLs based on keywords in the <title>, the URL, the visible text on the page, etc. It can monitor folders as well, so let's say you have Gscraper or Scrapebox scraping URLs and outputting them into a folder; this can monitor the folder and sort them in real time by checking at intervals. I show that in the video. You can quickly export any project to a .SL file as well. You would really have to use it to see how convenient/fast it works.

    Other things I'm considering adding after release as a future update:

    1) Automatically remove duplicates while sorting (monitor folder)

    2) Sort by PA/DA using MozAPI

    I'll just have to see what people want, but yeah, I haven't seen any other tool that works like this one before.
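    The monitor-and-sort behavior described above can be sketched in a few lines. This is only an illustration of the polling idea, not PI's actual implementation (the tool is closed-source); the function names and the keyword rules are hypothetical.

    ```python
    from pathlib import Path

    def sort_urls(urls, rules):
        """Sort URLs into named buckets by keyword match (hypothetical rules).

        `rules` maps a bucket name to a list of keywords; a URL lands in the
        first bucket whose keyword appears in it, else in 'unidentified'.
        """
        buckets = {name: [] for name in rules}
        buckets["unidentified"] = []
        for url in urls:
            for name, keywords in rules.items():
                if any(kw in url.lower() for kw in keywords):
                    buckets[name].append(url)
                    break
            else:
                buckets["unidentified"].append(url)
        return buckets

    def scan_folder_once(folder, seen, rules):
        """One polling pass: read any .txt files not seen before and sort their URLs."""
        new_urls = []
        for path in sorted(Path(folder).glob("*.txt")):
            if path.name in seen:
                continue
            seen.add(path.name)
            new_urls.extend(
                line.strip() for line in path.read_text().splitlines() if line.strip()
            )
        # dict.fromkeys de-duplicates while keeping order
        return sort_urls(dict.fromkeys(new_urls), rules)
    ```

    Real-time monitoring is then just calling `scan_folder_once` in a loop with a `time.sleep()` between passes, so files dropped in by Gscraper or Scrapebox get picked up on the next interval.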
  • I want to buy it right now, depending on the price. Is it possible ?
  • s4nt0s Houston, Texas
    @volver - Sorry, not yet. I'd rather not start selling it until it's ready.
  • "Sort by PA/DA using MozAPI"

    With PR updates never happening again, this would be a great idea.
  • waiting for release :-?
  • s4nt0s Houston, Texas
    @lunaya - Will be out this month :)
  • s4nt0s Houston, Texas
    @backlinkaddict - No, it doesn't do that right now; I've explained some of the features here in the thread. New features should be added after release.
  • When will the tool be available?
  • s4nt0s Houston, Texas
    edited December 2014
  • You went live and didn't say anything? wtf
  • s4nt0s Houston, Texas
    @clintbutler - It just went live today lol.
  • @s4nt0s - "GSA PI" lol. Excellent acronym & tool.
  • @s4nt0s

    I downloaded the trial. It seems easy enough to use, BUT with only a small list (61K URLs) it timed out when only a third of the way through. The system gives the impression that restarting the program will restart the job, but it looks like it went back to the beginning. If I want to process the rest of that file, do I need to mess about with the input file to delete the URLs already processed?
  • s4nt0s Houston, Texas
    @OldFusser - Thanks :)

    @filescape - What do you mean it timed out? Do you mean runtime error? If so, that was a bug that was just reported and now fixed. New update should be pushed soon when Sven is back in the office.

    If you're processing files, then it can restart where it left off, but if it's monitoring a folder, there is no way for it to start from where it left off, so that feature only works for processing files. If you see an issue, please shoot me a PM and let me know the details.
  • s4nt0s Houston, Texas
    Version 1.01 is available now and should fix the runtime error.
  • haryono in your heart
    @s4nt0s I tried the trial version and I like it, but I think it would be even better if your software could scrape outbound links and then filter them as well.

    E.g. sometimes I use a trackback or blog comment verified URLs list from GSA SER to scrape the outbound links from it (as we know, there are often many outbound links inside it), then I send it to the Scrapebox link extractor addon to scrape all the outbound links. As a result I get a list of 500k or more potential websites and send it to GSA SER.

    If GSA Platform Identifier could do this (scrape outbound links from verified URLs -> filter/identify URLs -> send to GSA), I think it would save us a lot of work (no need to buy more proxies for scraping URLs) - simple and fast. (Just an idea.)
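  • The core of the workflow suggested above (extract the outbound links from a fetched page) can be sketched with Python's standard library. This is only a sketch of the idea, not how PI or Scrapebox actually implement it; the class and function names are hypothetical, and fetching the pages themselves (e.g. with `urllib.request`) is left out.

    ```python
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse

    class OutboundLinkParser(HTMLParser):
        """Collect <a href> targets that point to a different domain than the page."""

        def __init__(self, base_url):
            super().__init__()
            self.base_url = base_url
            self.base_host = urlparse(base_url).netloc
            self.outbound = []

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            href = dict(attrs).get("href")
            if not href:
                return
            # Resolve relative links, then keep only external http(s) targets
            absolute = urljoin(self.base_url, href)
            parsed = urlparse(absolute)
            if parsed.scheme in ("http", "https") and parsed.netloc != self.base_host:
                self.outbound.append(absolute)

    def extract_outbound_links(html, page_url):
        """Return the outbound (external-domain) links found in `html`."""
        parser = OutboundLinkParser(page_url)
        parser.feed(html)
        return parser.outbound
    ```

    Running this over each verified URL's page and concatenating the results would give the kind of 500k-URL candidate list described above, ready to be identified/sorted.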
  • haryono in your heart
    Here is a sample I scraped from GSA SER verified trackback outbound links:

    [screenshot]

    Scrapebox link extractor:

    [screenshot]

    This way I never have to buy many proxies to scrape new URLs.

  • s4nt0s Houston, Texas
    @haryono - Interesting feature suggestion. There will definitely be an outbound link filter added so that it only sorts URLs that meet your OBL requirement, but I never thought about scraping the outbound link URLs.

    It might be something that would be good to add to the "tools" button where we have dup remove and the other stuff.

    GSA PI is more of a filtering tool than a scraper, but that is an interesting idea.

    User imports URLs > PI scrapes outbound links on the imported URLs, then identifies and sorts them. I'll talk to the programmer about this. Thanks for the suggestion :)
  • haryono in your heart
    You can use the "Automatically Export Verified Urls" tool in GSA SER, but with more specific verified URLs (e.g. URLs with OBL more than xxx exported to GSA PI). @Sven can do it. I guarantee that if GSA PI can do this, it will increase your sales :D maybe ....
  • @s4nt0s I saw in your video that you use the failed folder for submission. That's not good long-term, because "failed" URLs from SER are also saved there; I think a better option would be to add one more field to the GSA global lists.
    What do you think about it?
  • edited December 2014
    Or maybe use the identified folder.
  • s4nt0s Houston, Texas
    @rafi - Everything in the video was just an example; you can use any folder you want. :)
  • What do you think about it? Are you still working with the failed folder?
  • s4nt0s Houston, Texas
    @rafi - I usually just export the .SL from the project in PI and import it directly into SER like that.
  • Maybe it's time to work on this with SER - is that possible? Like I say, one additional field/folder for the global list in SER would resolve the main problem with Gscraper, SER, and PI working together.