
♦♦♦ Beta Testers Needed For Stand-Alone Identify And Sort-In ♦♦♦

Hey everyone, 

I'm looking for around 5 people to beta test the stand-alone identify and sort-in tool before the final release.

Here's a screenshot:

[screenshot]

Some features:

  • Ability to run multiple identify and sort-in campaigns simultaneously.
  • Identify and sort in based on keywords in metadata, <title>, visible text, etc.
  • Ability to monitor a folder in real time, so while your external scraper (e.g. Scrapebox/GScraper) is scraping, PI will automatically read from the output file and identify and sort in the URLs (see the sketch after this list).
  • Easily export projects to .SL
  • Identify custom platforms
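To give a rough idea of how the real-time folder monitoring could work conceptually, here's a minimal sketch of that kind of watch-and-read loop. It's a plain polling approach in Python; the file name and keyword patterns are made-up examples, and this is only an illustration of the idea, not PI's actual code:

```python
# Minimal sketch (not PI's actual code): poll a scraper output file and
# classify newly appended URLs with simple keyword patterns.
import time
from pathlib import Path

OUTPUT_FILE = Path("scraper_output.txt")   # hypothetical Scrapebox/GScraper output file
PLATFORM_PATTERNS = {                       # hypothetical footprint keywords per platform
    "wordpress": ["wp-content", "wp-login"],
    "drupal": ["/node/", "drupal"],
}

def identify(url: str) -> str:
    """Return the first platform whose keyword appears in the URL, else 'unidentified'."""
    lowered = url.lower()
    for platform, keywords in PLATFORM_PATTERNS.items():
        if any(k in lowered for k in keywords):
            return platform
    return "unidentified"

def watch(poll_seconds: float = 5.0) -> None:
    """Every few seconds, read only the portion of the file appended since the last poll."""
    offset = 0
    while True:
        if OUTPUT_FILE.exists():
            with OUTPUT_FILE.open("r", encoding="utf-8", errors="ignore") as f:
                f.seek(offset)
                new_text = f.read()
                offset = f.tell()
            for line in new_text.splitlines():
                url = line.strip()
                if url:
                    # The real tool would sort the URL into a per-platform file/project here.
                    print(identify(url), url)
        time.sleep(poll_seconds)

if __name__ == "__main__":
    watch()
```

A file-watching library could replace the polling loop, but polling keeps the example dependency-free and works fine for files that are appended to continuously.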
I'm only looking for people who have the time to really test and provide feedback on bugs, improvements, etc. If you don't have the time, please don't apply.

If you're interested, please post in this thread and list what operating system you're able to test it on. Once I have 5 people picked out, I'll start a private chat with the 5 selected members and we can discuss improvements/bugs there and this thread will be closed.

Thanks!



Comments

  • BlazingSEO http://blazingseollc.com/proxy
    Nice work Devin, best of luck :)
  • s4nt0s Houston, Texas
    @BanditIM - Thanks :)
  • s4nt0s Houston, Texas
    Added a basic overview video to show how it works: 


  • I can test on a server running Windows Server 2008 R2.
  • s4nt0s Houston, Texas
    @jpvr90 - Ok sounds good. Will PM you.
  • I can test
  • I am using SER a lot at the moment. Not at the velocity of most users, but I babysit the VPS and local install a lot and would be able to give feedback. PM me if you need a tester.
  • I also have a dedicated server that isn't under load right now, so I can test. Besides, I have a big list of different links that haven't been identified through GSA.
  • sagarpatil 1LinkList Ninja
    I saw the preview. Looks great. I run GSA SER on Windows Server 2008 R2. I'm available on Skype every day for 15 hours and can provide you with feedback :)
    I'd love to be a beta user.
  • s4nt0s Houston, Texas
    Ok, will PM all of you today after I get back from boxing. Thanks for the help. :)
  • 1linklist FREE TRIAL Linklists - VPM of 150+ - http://1linklist.com
    Would love to check this out, man. We've already got data pouring in and have a similar process after we scrape URLs. Count me in if you've got spots left ;)
  • s4nt0s Houston, Texas
    @1linklist - of course you're in. :)
  • Hello, I can test. Windows Server 2008 R2 :)
  • edited October 2014
    Hi Devin,

    I currently have GScraper running on a dedi from GreenCloudVPS. I would like to be a beta tester for this tool. I usually split my scraped URLs into dummy projects to sort them automatically, but I wanted to see the difference in the link count if I use your software.

    Looking forward to beta testing this.

    PS: While watching the video, I noticed that there's no option to use proxies. Might as well add that as one of the features.
  • edited October 2014
    Looking forward to this.
  • s4nt0s Houston, Texas
    Hey, 

    We have more than enough beta testers for now. Thanks, guys. If we need more people to test, I'll shoot you a message.

  • I was the one who requested this feature way back when. Can I beta test?
  • **Feature request**
    GScraper can save harvested URLs to one .txt file. Your SER identifier could read and delete this file, remove duplicate domains/URLs every X minutes, and sort + identify platforms. Let's say I tell GScraper to save harvested URLs to a harvested.txt file. When I delete it, GScraper creates a new file within seconds, so this doesn't disrupt the harvesting process. This could easily automate the GScraper + GSA combo. Without removing duplicates during the sort-and-identify process, we'd end up with a very big identified file containing tons of duplicates.

    Or your SER identifier could remove duplicates from all files in the identified folder every Y minutes.

    Or your identifier could process only new domains harvested by GScraper (identified domains saved in one file, processed.txt; if a domain already exists in processed.txt, the identifier skips it). This would be best because, in theory, some footprints will give you tons of the same results in GScraper, so you sort and identify the same websites again and again. As a result (in theory), if you sort and identify the same websites on the same host multiple times, it can look like a DDoS or something similar, and that activity can trigger an automated abuse message.
    Just my 2 cents ;) A rough sketch of the processed.txt idea is below.
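    To make the suggestion concrete, here's a minimal sketch of that monitor-and-dedup loop. The file names (harvested.txt, processed.txt) and the interval are just the examples from above, Python is used purely for illustration, and this is not PI's actual implementation:

```python
# Rough illustration of the "process only new domains" idea (not PI's code):
# read GScraper's harvested.txt, delete it so GScraper recreates it, then keep
# only domains we haven't already identified (tracked in processed.txt).
import time
from pathlib import Path
from urllib.parse import urlparse

HARVESTED = Path("harvested.txt")    # file GScraper appends harvested URLs to
PROCESSED = Path("processed.txt")    # one domain per line, everything already identified
INTERVAL = 600                       # "every Y minutes" -> 10 minutes here

def load_processed() -> set:
    """Load the set of domains that have already been identified."""
    if PROCESSED.exists():
        return set(PROCESSED.read_text(encoding="utf-8", errors="ignore").split())
    return set()

def run_once() -> None:
    if not HARVESTED.exists():
        return
    urls = HARVESTED.read_text(encoding="utf-8", errors="ignore").splitlines()
    HARVESTED.unlink()                       # GScraper recreates the file and keeps harvesting
    seen = load_processed()
    new_domains = []
    for url in urls:
        domain = urlparse(url.strip()).netloc.lower()   # assumes full URLs with a scheme
        if domain and domain not in seen:
            seen.add(domain)
            new_domains.append(domain)
            # ...hand the URL to the identify/sort-in step here...
    with PROCESSED.open("a", encoding="utf-8") as f:
        f.writelines(d + "\n" for d in new_domains)

if __name__ == "__main__":
    while True:
        run_once()
        time.sleep(INTERVAL)
```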
  • Hit me up if you need another tester. I run many projects and can give good feedback and suggestions. Not just looking to get something for free.
  • Wow, just watched the video. This brings control over GSA SER campaigns to a whole new level!
  • haryono in your heart
    Similar to RankCracker, right?
  • s4nt0s Houston, Texas
    @satyr85 - Thanks for the suggestions :)

    @backlinkaddict - will do.

    @haryono - Not really, there is a lot more to this tool.
  • What does RC have to do with this... can someone say XRumer?
  • Hey Devin, any idea when PI will be released and completely separate from GSA?
  • fng
    edited November 2014

    How much will this cost?

  • s4nt0s Houston, Texas
    @backlinkaddict - Well, the only thing stopping it from being released now is fixing any bugs reported by the beta testers.

    @fng - Not sure yet :/
  • @s4nt0s Are you still taking beta testers?
  • s4nt0s Houston, Texas
    @Clintbutler - I'm good on beta testers for now, but if I need more I'll shoot you a PM. Thanks
  • @s4nt0s Great work, buddy! I have a list with more than 10 million URLs from my GSA campaigns so far, and I want to put them into your software to sort them and remove the dead ones. Is it possible to be a beta tester or to buy a copy of your software? Thanks in advance...
  • s4nt0s Houston, Texas
    @georgikz5 - Thanks, if I need more testers I'll let you in. I already let in more than I needed due to PMs, but for now we're good. The software should be launched pretty soon (aiming for 1-2 weeks max).

