
♦♦♦ Beta Testers Needed For Standalone Identify And Sort-In ♦♦♦

Hey everyone, 

I'm looking for around 5 people to beta test the standalone identify-and-sort-in tool before the final release.

Here's a screenshot:

[screenshot]

Some features:

  • Ability to run multiple identify and sort-in campaigns simultaneously.
  • Identify and sort in based on keywords in metadata, <title>, visible text, etc.
  • Ability to monitor a folder in real time, so as your external scraper (e.g. Scrapebox/Gscraper) is scraping, PI will automatically read from the output file and identify and sort in the URLs (see the sketch after this list).
  • Easily export projects to .SL
  • Identify custom platforms
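
To give a rough idea of what the folder monitoring does, here's a minimal Python sketch (illustration only, not PI's actual code - the folder name and interval are made up):

    import os, time

    WATCH_DIR = "scraper_output"   # hypothetical folder your scraper writes to
    INTERVAL = 10                  # seconds between checks
    offsets = {}                   # bytes already read, per file

    def new_urls():
        """Yield URLs appended to any .txt file since the last check."""
        for name in os.listdir(WATCH_DIR):
            if not name.endswith(".txt"):
                continue
            path = os.path.join(WATCH_DIR, name)
            with open(path, "r", errors="ignore") as f:
                f.seek(offsets.get(path, 0))
                for line in f:
                    url = line.strip()
                    if url:
                        yield url
                offsets[path] = f.tell()

    while True:
        for url in new_urls():
            print("would identify and sort in:", url)  # PI classifies it at this point
        time.sleep(INTERVAL)
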
I'm only looking for people who have the time to really test and provide feedback on bugs, improvements, etc. If you don't have the time, please don't apply.

If you're interested, please post in this thread and list which operating system you can test it on. Once I have 5 people picked out, I'll start a private chat with the selected members; we can discuss improvements/bugs there, and this thread will be closed.

Thanks!


Comments

  • BlazingSEO http://blazingseollc.com/proxy
    Nice work Devin, best of luck :)
  • s4nt0s Houston, Texas
    @BanditIM - Thanks :)
  • s4nt0s Houston, Texas
    Added a basic overview video to show how it works: 


  • I can test on a server running Windows Server 2008 R2.
  • s4nt0s Houston, Texas
    @jpvr90 - Ok sounds good. Will PM you.
  • I can test
  • I'm using SER a lot at the moment. Not at the velocity of most users, but I babysit the VPS and local install a lot and would be able to give feedback. PM me if you need a tester.
  • Also, I have a dedicated server that isn't loaded right now, so I can test. Besides that, I have a big list of different links that haven't been identified through GSA.
  • sagarpatil 1LinkList Ninja
    I saw the preview. Looks great. I run GSA SER on Windows Server 2008 R2. I'm available on Skype every day for 15 hours and can provide you feedback :)
    I'd love to be a beta user.
  • s4nt0s Houston, Texas
    Ok, will PM all of you today after I get back from boxing. Thanks for the help. :)
  • 1linklist FREE TRIAL Linklists - VPM of 150+ - http://1linklist.com
    Would love to check this out, man. We've already got data pouring in and have a similar process after we scrape URLs. Count me in if you've got spots left ;)
  • s4nt0s Houston, Texas
    @1linklist - of course you're in. :)
  • Hello, I can test, Windows Server 2008 R2 :)
  • edited October 2014
    Hi Devin,

    I currently have GScraper running on a dedi from GreenCloudVPS. I would like to be a beta tester for this tool. I usually split my scraped URLs into dummy projects to sort them automatically, but I wanted to see the difference in link count if I use your software.

    Looking forward to beta testing this.

    PS: While watching the video, I noticed that there's no option to use proxies. Might as well add that as one of the features.
  • edited October 2014
    Looking forward to this
  • s4nt0s Houston, Texas
    Hey, 

    We have more than enough beta testers for now. Thanks guys - if we need more people to test, I'll shoot you a message.

  • I was the one who requested this feature way back when. Can I beta test?
  • **Feature request**
    Gscraper can save harvested URLs to one txt file. Your SER identifier could read and delete this file, remove duplicate domains/URLs every X minutes, and sort + identify platforms. Let's say I tell Gscraper to save harvested URLs to a harvested.txt file. When I delete it, Gscraper creates a new file in seconds, so this doesn't disrupt the harvesting process. This could easily automate the Gscraper + GSA combo. Without removing duplicates during the sort-and-identify process, we'll end up with a very big identified file with tons of duplicates.

    Or your SER identifier could remove duplicates from all files in the identified folder every Y minutes.

    Or your identifier could process only new domains harvested by Gscraper (identified domains saved in one file - processed.txt - and if a domain already exists in processed.txt, the identifier skips it). This would be best, because in theory, for some footprints Gscraper will give you tons of the same results, so you sort and identify the same websites again and again. As a result (in theory), if you sort and identify the same websites on the same host multiple times, it can look like a DDoS or something similar, and this activity can trigger an automated abuse message. There's a rough sketch of this below.
    Just my 2 cents ;)
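
    To illustrate that third option, here's a rough Python sketch (just the logic - the file names match my example above, everything else is made up):

        from urllib.parse import urlparse

        PROCESSED = "processed.txt"    # domains already identified
        HARVESTED = "harvested.txt"    # Gscraper output

        # Load the domains we've already identified.
        try:
            with open(PROCESSED) as f:
                seen = set(line.strip() for line in f if line.strip())
        except FileNotFoundError:
            seen = set()

        fresh = []
        with open(HARVESTED, errors="ignore") as f:
            for line in f:
                url = line.strip()
                domain = urlparse(url).netloc.lower()
                if domain and domain not in seen:
                    seen.add(domain)
                    fresh.append(url)   # only URLs from new domains get identified

        # Record the newly seen domains; `fresh` goes to the identify/sort step.
        with open(PROCESSED, "w") as f:
            f.write("\n".join(sorted(seen)))
        print(len(fresh), "URLs from new domains to identify")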
  • haryono in your heart
    Similar to RankCracker, right?
  • s4nt0s Houston, Texas
    @satyr85 - Thanks for the suggestions :)

    @backlinkaddict - will do.

    @haryono - Not really, there is a lot more to this tool.
  • fng
    edited November 2014

    How much will this cost?

  • s4nt0s Houston, Texas
    @backlinkaddict - Well, the only thing stopping it from being released now is fixing any bugs reported by the beta testers.

    @fng - Not sure yet :/
  • @s4nt0s Are you still taking beta testers?
  • s4nt0s Houston, Texas
    @Clintbutler - I'm good on beta testers for now, but if I need more I'll shoot you a PM. Thanks
  • @s4nt0s Great work, buddy! I have a list with more than 10 million URLs from my GSA campaigns so far, and I want to put them into your software to sort them and remove the dead ones. Is it possible to be a beta tester or to buy a copy of your software? Thanks in advance...
  • s4nt0s Houston, Texas
    @georgikz5 - Thanks, if I need more testers I'll let you in. I already let in more than I needed due to PMs, but for now we're good. The software should be launched pretty soon (aiming for 1-2 weeks max).


  • @s4nt0s got you man, I am willing to pay for a copy now if it's possible. I really need it to sort my lists faster... Thanks
  • royserpa (URL and Domain Deduped!) GSA SER VERIFIED Lists! => http://gsa-verified-list.royserpa.com/
    @s4nt0s pretty interesting stuff pal

    Perhaps there could be some discounts to owners of a gsa ser license? :)
  • s4nt0s Houston, Texas
    @georgikz5 - PM'd you.

    @royserpa - There should be a small discount for everyone like the other sales threads.
  • royserpa (URL and Domain Deduped!) GSA SER VERIFIED Lists! => http://gsa-verified-list.royserpa.com/
    @s4nt0s

    Alright!

    Perhaps the ones that have other GSA software can get a deeper discount? :)
  • s4nt0s Houston, Texas
    @royserpa - Sorry, there won't be additional discounts other than that right now.
  • royserpa (URL and Domain Deduped!) GSA SER VERIFIED Lists! => http://gsa-verified-list.royserpa.com/
    @s4nt0s hehe alright bro.
  • What is the difference between your app and Sick Submitter's free Platform Reader?
  • s4nt0s Houston, Texas
    edited November 2014
    @fng - I downloaded Sick Platform Reader, but is there a paid version? The version I found only allows 25 threads, loading one file at a time, a single project at a time, etc. If that's how it is, they're way different :)
  • What is the difference between SER's built-in platform identifier and this one, apart from being able to sort platforms from multiple files and projects (which is a big thing, btw)? Is there something more?
  • s4nt0s Houston, Texas
    edited November 2014
    @katjzer - It can also sort URLs based on keywords in the <title>, URL, visible text on the page, etc. It can monitor folders as well: let's say you have Gscraper or Scrapebox scraping URLs and outputting them into a folder - this can monitor the folder and sort them in real time by checking at intervals. I show that in the video. You can quickly export any project to a .SL file as well. You would really have to use it to see how convenient/fast it works (there's a rough sketch of the keyword sorting at the end of this post).

    Other things I'm considering adding after release as a future update:

    1) Automatically remove duplicates while sorting (monitor folder)

    2) Sort by PA/DA using MozAPI

    I'll just have to see what people want, but yeah, I haven't seen any other tool that works like this one before.
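
    Just to make the keyword sorting concrete, here's a bare-bones Python illustration (not PI's actual code - the keyword buckets are made up; in PI this would come from your project setup):

        import re
        import urllib.request

        # Hypothetical keyword buckets mapping platform names to telltale strings.
        BUCKETS = {
            "wordpress": ["wp-content", "powered by wordpress"],
            "forums":    ["phpbb", "vbulletin", "powered by smf"],
        }

        def classify(url):
            """Return the first bucket whose keywords appear in the page title or HTML."""
            try:
                html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
            except Exception:
                return None
            m = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
            title = m.group(1) if m else ""
            haystack = (title + " " + html).lower()
            for bucket, keywords in BUCKETS.items():
                if any(k in haystack for k in keywords):
                    return bucket
            return None

        print(classify("http://example.com"))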
  • I want to buy it right now, depending on the price. Is it possible ?
  • s4nt0s Houston, Texas
    @volver - Sorry, not yet. I'd rather not start selling it until it's ready.
  • "Sort by PA/DA using MozAPI"

    With PR never being updated again, this would be a great idea.
  • waiting for release :-?
  • s4nt0s Houston, Texas
    @lunaya - Will be out this month :)
  • s4nt0s Houston, Texas
    @backlinkaddict - No, it doesn't do that right now; I've explained some of the features here in the thread. New features should be added after release.
  • When will the tool be available?
  • s4nt0s Houston, Texas
    edited December 2014
  • You went live and didn't say anything? wtf
  • s4nt0s Houston, Texas
    @clintbutler - It just went live today lol. 
  • @s4nt0s - "GSA PI" lol. Excellent acronym & tool.
  • @s4nt0s

    I downloaded the trial, and it seems easy enough to use, BUT with only a small list (61K URLs) it timed out when only a third of the way through. The system gives the impression that restarting the program will restart the job, but it looks like it went back to the beginning. If I want to process the rest of that file, do I need to mess about with the input file to delete the URLs already processed?
  • s4nt0s Houston, Texas
    @OldFusser - Thanks :)

    @filescape - What do you mean it timed out? Do you mean a runtime error? If so, that was a bug that was just reported and is now fixed. A new update should be pushed soon when Sven is back in the office.

    If you're processing folders, it can restart where it left off, but if it's monitoring a folder, there is no way for it to start from where it left off, so that feature only works for processing files. If you see an issue, please shoot me a PM and let me know the details.
  • s4nt0s Houston, Texas
    Version 1.01 is available now and should fix the runtime error.
  • haryono in your heart
    @s4nt0s I tried the trial version and I like it, but I think it would be even better if your software could scrape outbound links and then filter them as well.

    E.g. sometimes I take the verified trackback or blog comment URL list from GSA SER and scrape the outbound links on those pages (as we know, there are often many outbound links on them), then I send it to the Scrapebox link extractor addon to scrape all the outbound links. The result is a list of 500k or more potential websites, which I send to GSA SER.

    If GSA Platform Identifier could do this (scrape outbound links from verified URLs -> filter/identify the URLs -> send to GSA), I think it would save us work (no need to buy more proxies for scraping URLs) - simple and fast. (Just an idea - rough sketch below.)
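
    To illustrate the extraction step, here's a rough Python sketch (the input file name is made up, and this is just the bare logic - not GSA's code):

        import re
        import urllib.request
        from urllib.parse import urljoin, urlparse

        def outbound_links(url):
            """Collect links on `url` that point to other domains."""
            try:
                html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
            except Exception:
                return set()
            host = urlparse(url).netloc
            links = set()
            for href in re.findall(r'href=["\'](.*?)["\']', html, re.I):
                full = urljoin(url, href)
                if full.startswith("http") and urlparse(full).netloc != host:
                    links.add(full)
            return links

        # Feed every verified URL through the extractor; the output would then
        # go to the identify/sort step (or be exported for GSA SER).
        with open("verified_urls.txt", errors="ignore") as f:   # hypothetical input
            for line in f:
                if line.strip():
                    for link in outbound_links(line.strip()):
                        print(link)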
  • haryono in your heart
    Here is a sample I scraped from GSA SER verified trackback outbound links:
    [screenshot]

    Scrapebox link extractor:

    [screenshot]

    This way I never have to buy many proxies to scrape new URLs.

  • s4nt0s Houston, Texas
    @haryono - Interesting feature suggestion. There will definitely be an outbound link filter added so that it only sorts URLs that meet your OBL requirement, but I never thought about scraping the outbound link URLs.

    It might be something that would be good to add under the "tools" button, where we have the duplicate remover and the other stuff.

    GSA PI is more of a filtering tool than a scraper, but that is an interesting idea.

    User imports URLs > PI scrapes the outbound links on the imported URLs and identifies and sorts them. I'll talk to the programmer about this. Thanks for the suggestion :)
  • haryono in your heart
    You could use the GSA SER tool "Automatically Export Verified Urls", but with more specific verified URLs (e.g. URLs with OBL of more than xxx exported to GSA PI). @Sven can do it, I guarantee. If GSA PI can do this, it can increase your sales :D maybe ....
  • @s4nt0s I saw in your video that you use the failed folder for submission. That's not good long term, because "failed" entries from SER are also saved there. I think a better option would be to add one more field to the GSA global lists.
    What do you think about it?
  • edited December 2014
    Or maybe use the identified folder.
  • s4nt0s Houston, Texas
    @rafi - Everything in the video was just an example, you can use any folder you want. :)
  • What do you think about it? Are you still working with the failed folder?
  • s4nt0s Houston, Texas
    @rafi - I usually just export the .SL from the project in PI and import it directly into SER like that. 
  • Maybe it's time to work on this with SER - is that possible? Like I say, one additional field/folder for the global list in SER would resolve the main problem with Gscraper, SER, and PI working together.