
ScraperBandit - Beta Testing Open To Public

BlazingSEO http://blazingseollc.com/proxy
edited December 2014 in GSA Search Engine Ranker
*Edit* - ScraperBandit Service is now fully up and running! Check out the official sales thread for more details: https://forum.gsa-online.de/discussion/15247/scraperbandit-the-only-scraping-system-you-ll-ever-need?new=1

After 8 grueling months, 5 developers, and thousands of dollars spent (and wasted), we are FINALLY able to release a beta version to the public for ScraperBandit! For detailed information on what this system is about and previous interest, please refer to this thread:

https://forum.gsa-online.de/discussion/10810/new-google-scraper-about-to-come/p1

Note: The Footprint system is not quite ready. We wanted to get the scraping part out as fast as possible and want to get that perfected first.


In a nutshell, this system will replace your current Google scraping servers at a much lower cost. People argue that they pay only a one-time fee for GScraper/ScrapeBox, but what they don't count in their expenses are the servers and proxies needed to run that software. Our system handles all the server usage, all the proxies, and most importantly the link handling. Also, would you rather scrape 50 million total links in a month with your current setup, or 50 million links in a single day using our system? Your answer depends on how much you want to increase your link building in GSA SER.

In our current beta version we allow FTP and DROPBOX upload which gives you the opportunity to constantly append to a certain text file in your Dropbox folder so that GSA SER can continually read that updated site list. How powerful is that?! Forget GSA scraping for you, forget having to constantly transfer over files from ScrapeBox/Gscraper to your sitelist file --- our system will handle it for you automatically!
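As an illustrative sketch (not our actual code), the append-and-read loop on your side could look like this in Python. The file name and folder are assumptions, and a temp directory stands in for a synced Dropbox folder:

```python
import tempfile
from pathlib import Path

# Hypothetical sketch of the append-to-sitelist workflow described above.
# A temp dir stands in for the synced Dropbox folder; the file name is an
# assumption, not part of the service's actual API.
SITELIST = Path(tempfile.mkdtemp()) / "gsa_sitelist.txt"

def append_links(new_links):
    """Append scraped URLs to the site list, skipping duplicates,
    so GSA SER can keep re-reading one continually growing file."""
    existing = set()
    if SITELIST.exists():
        existing = set(SITELIST.read_text().splitlines())
    fresh = [u for u in new_links if u and u not in existing]
    with SITELIST.open("a") as f:
        f.writelines(u + "\n" for u in fresh)
    return len(fresh)
```

Each delivery just calls `append_links` with the new batch, and GSA SER keeps reading the same ever-growing file.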

The starting price (tentatively) will be $0.0000005 per link, which is $5 for 10 million scraped links. This price MAY change as we see our LPM increase/decrease according to the footprints given by people. However, for beta testing, you will be given 24 million links FREE each day until the end of beta testing!
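A quick sanity check on that math:

```python
# Verifying the quoted pricing (numbers taken from the post above).
price_per_link = 0.0000005           # dollars per scraped link
order_cost = price_per_link * 10_000_000
print(round(order_cost, 2))          # 5.0 dollars for 10 million links
```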

To be a beta tester, you must give an honest review of the system once you've given it a test. I don't care if it's bad or good, as long as you're honest (and not a competitor of course). If you do not give a review after testing, we will unfortunately have to ban you from the system for further usage. Taking 2 minutes out of your day to post a review is not much to ask for 24 million links FREE each day, so please help us out!

Simply post below if you are interested and I will PM you with details on how to sign up for your free account.

Thank you so much for the continued support of BanditIM everyone, I really appreciate it!

- Neil Emeigh
BanditIM Admin

Comments

  • donaldbeck Advanced SER Videos -> http://bit.ly/1ySrbwu | Learn SER For Free Step By Step -> http://sertips.com
    @BanditIM

    I'll test it and leave you an honest review.
  • I am interested in this service and will leave a review.  Please PM me the details.  Thanks
  • Absolutely interested, and will definitely leave a review.  This is exactly what I need!
  • Would be glad to test a new scraper and leave my review.
  • Interested; I would give good feedback on which features to add, and would leave an honest review too.
  • magically http://i.imgur.com/Ban0Uo4.png
    Surely interested; feedback is essential in order to improve, hence no problem on my end.
  • I've been using GScraper for the past few years and would definitely be interested in a service like yours. I'd like to get beta access if possible :).
  • BlazingSEO http://blazingseollc.com/proxy
    PMs sent out to everyone, enjoy :)
  • Hi Neil,

    Definitely interested, count me in.
  • Tim89 www.expressindexer.solutions
    I've had the opportunity to test this out already and it does exactly what it is meant to do.

    This service literally takes away your scraping work; it cuts the costs of running a scraper: proxies, a VPS, resources in general.

    It's a set-and-forget / plug-n-play system; it'll continuously scrape your targets. The only thing you've got to make sure of is that you have the resources to process these lists :).

    Nobody should have to purchase lists anymore with something like this. It has taken the tedious task of scraping and cut costs dramatically. I mean, $5 for 10,000,000 URLs, when some of the list sellers charge around $50 for up to 100,000 links... You can obtain 50,000,000 potential targets for that price alone.

    All you need is keywords + footprints.

    Great work @BanditIM‌ for creating such an innovative piece of online software, all the best!


  • magically http://i.imgur.com/Ban0Uo4.png
    Great work, and thank you for letting me take part in the beta test.

    Intro Review:

    The interface is very user-friendly and easy to navigate.
    Actually, even an idiot would be able to navigate and use the GUI without losing direction.

    Naturally some more descriptions could be added, and tips from me will follow.

    Problem 1:

    I experienced problems when I tried to manually paste the footprints and keywords into the forms.
    After 2 attempts, I realized that there had to be a problem, so I switched to uploading files instead.

    Uploading the 2 txt files containing the footprints and keywords seems to work flawlessly.

    Currently I'm waiting for completion; in my case I have chosen to have the list delivered directly into Dropbox.

    I have refreshed the 'results' a few times and so far it only shows 'in process'. Hopefully this will show what percentage is left; time will tell.

    The actual review will come later.


  • magically http://i.imgur.com/Ban0Uo4.png
    edited November 2014
    Intro Review Part 2:

    WOW - The results got delivered within 10 minutes directly into my Dropbox as a zip file.
    Even got an email: "Your google scrapping campaign has finished"

    Now that is surely something :D Let's dedupe the files and run them through GSA SER afterwards.

    Update:
    Actually the zip file contains three versions:
    1. Full list
    2. Unique domains
    3. Unique URLs

    Awesome!

    So far I'm pretty impressed by the speed and easy navigation!

    After running them through the Cannon, I will make yet another follow-up - Stay Tuned!
  • edited November 2014
    The interface is clean and easy to use. One issue is that it limits the number of keywords/footprints you can enter based on generating 24,000,000 links. The problem is, not every keyword is going to return the max # of links, so you end up getting fewer than expected.

    My run was supposed to get just under the daily 24 mil, here are the results:
    40,603 total URLs scraped
    27,530 unique URLs
    7,077 unique domains

    Now, I can still do more scraping, since I didn't actually use up the 24 mil links, but that means going back in and starting new campaigns until I use it up.

    It would be easier if it let you set up a monstrous campaign that could last weeks and just processed it in batches to stay within the daily limit.


    Also, it appears the footprints did not work at all, so I doubt I will be able to build many links, if any, with these URLs. There were a ton from Amazon and Craigslist, for example.

    Once the footprint system is working, with a few tweaks - this thing is going to be a monster!
  • I am interested in testing this new service
  • magically http://i.imgur.com/Ban0Uo4.png

    As a beta tester I have already submitted a review of two parts of this product:

    1. The GUI/Interface
    2. The functionality inside the interface

    Both seem to be very well made, except for the previously mentioned minor problem with manual submission of the footprints and keywords.
    Uploading the files works flawlessly.

    Part 3 is about the actual result:

    Obviously the end result is currently not optimal, for the following reasons:

    1. Too many nofollow links (84%) + too few verified (that is of course caused by other parameters, such as GSA SER itself + the footprints/keywords)
    2. Not sharply aimed and targeted according to the footprints.

    What I do like is that we are not facing the 'already parsed' problem, which seems to happen over and over again when GScraper is used.
    This scrape is cleaner, though unfortunately not 100% optimized according to the submitted footprints.

    My LPM was 25 running only 100 threads.
    I was able to get a list with 41,900 unique URLs to run through GSA SER.

    Those result links were clearly not the ones I was trying to target with the submitted footprints.


    We have to realize that it doesn't matter if we can pull 100 million links if those links are not targeted 'sharp as chili' according to the submitted footprints.

    In order to really deliver customer value, the footprint targeting must be improved - not just a little, but dramatically!

    I hope to be able to run yet another test where those footprints are tightened up.

    Overall the product is innovative and looks promising so far, but bear in mind that the actual outcome/result must be of a higher quality.
  • miren Macedonia
    Looks promising. Would like to try this.
  • Hi @BanditIM , can you please check my PM ? I didn't get the confirmation email for registration at all.
    GREAT news - I was about to get started with 3 dedicated servers running GScraper, plus the additional proxy subscription add-ons from GScraper, when I got your thread link from my partner. He said to grab a beta test spot and get started, because I'm looking at a 12+ month subscription. If it works the way you described, an HONEST review is coming your way. Thanks for developing such a great service - I want to sign up as a BETA tester, so PLEASE send me access if you don't mind.

    Have a nice day, and we wish you all the best!!! :)
  • Where can I sign up?
  • BlazingSEO http://blazingseollc.com/proxy
    All new PMs sent out for access.

    @Tim89 - Thanks so much for the fast and thorough review pal, glad you liked it. Still waiting on a PM back regarding Yahoo/Bing interest ;)

    @magically - Wow wow, thanks so much for the thorough review for everybody! We are looking into the copy/paste bug right now, as another user seems to be having similar problems with it. For now, everyone should just Upload the files - that seems to be working fine.

    Regarding results - I have to remind everyone that it all comes down to the footprints and keywords you give :). Sure, you could scrape millions of results using plain keywords with no footprints, but how targeted are they going to be for GSA usage? - not very. We do plan on adding features later on that sort the lists by nofollow, as well as identify and sort the lists for GSA to use, but right now it's all up to what you feed the system.


    @almostbasic - I completely agree about the monster campaigns. We have it on our to-do, it just wasn't top priority so that we could concentrate getting the base system out first. Regarding the 24 million links, let me clarify what's happening:

    We charge 1000 link credits for each search term you submit. You're right, each search term will not return the full number of results, so when your order comes back to us what we actually do is credit BACK the number of links that were not scraped. For example, if your search term only came back with 500 links, your balance would get +500 links at the end of the campaign :).
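In pseudocode terms, the accounting looks something like this (an illustrative sketch, not our actual billing code):

```python
# Illustrative sketch of the reserve-then-refund credit accounting
# described above; function and variable names are my own invention.
RESERVE_PER_TERM = 1000

def settle_campaign(balance, results_per_term):
    """Reserve 1000 credits per search term up front, then refund
    the shortfall for every term that returned fewer results."""
    balance -= RESERVE_PER_TERM * len(results_per_term)
    refund = sum(RESERVE_PER_TERM - min(n, RESERVE_PER_TERM)
                 for n in results_per_term)
    return balance + refund

# One term that returns only 500 links costs just 500 credits net.
print(settle_campaign(24_000_000, [500]))  # 23999500
```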

    Regarding footprints - our system scrapes no differently than GScraper/ScrapeBox; it all comes down to what footprints and keywords you feed it. If they aren't of quality, then you're going to get bad results unfortunately. Thanks so much for the review though pal, appreciate it!



    I want to remind everyone this is termed "BETA" for a reason - we expect bugs :D. However, my developer's time is completely devoted to fixing bugs during this phase, so we hope to get them all fixed within 24 hours of being reported. We are relying on you to help us debug this system to make it as perfect as possible - thank you so much!

    - Neil
  • I would like to try! Thanks.
  • If this is as good as your text captcha service and email service I'm deffo in mate. Would like to test it if still available thanks
  • magically http://i.imgur.com/Ban0Uo4.png
    @BanditIM

    Here is something I don't quite understand... Initially you wrote this in the very first post:

    Note: The Footprint system is not quite ready. We wanted to get the scraping part out as fast as possible and want to get that perfected first.

    I added the very same footprints GSA SER uses, plus keywords - that should bring results according to that information, right?

  • I'm interested in having access to the beta testing


    Thanks
  • I am interested in doing a beta test review. If you can offer this to me, I can spare a few minutes of my day to type out a few lines. That's not too much to ask.
  • s4nt0s Houston, Texas
    I've only been beta testing for an hour or so, but so far this thing is pretty impressive. I think one of the best parts about the service is that anyone can use it without any prior scraping knowledge.

    As long as you understand how footprints/keywords work, you can start scraping right away. 

    Here are the steps it takes to start scraping:

    - Click create project
    - Name your campaign
    - Enter keywords
    - Enter footprints
    - Click start campaign

    It really doesn't get any easier than that.

    The UI is very clean and straight forward.
    The scraping speed is very fast - I got my results returned in a matter of minutes.
    Not having to use and burn through my own proxies is a huge plus.
    Not having to use my own server resources is another big plus.

    My first small test run:
    Loaded 32 article footprints (pulled directly from SER)
    Loaded 14 random keywords

    Results:
    Unique URLs: 11,157
    Unique domains: 4,349

    After identifying and sorting, about 18% of the URLs were recognized as postable by SER:

    [screenshot of the identify-and-sort results]

    Keep in mind this was using just a very small sample of 11,000 URLs. Loading in 1,000,000 URLs should yield some pretty good results.

    I think a lot of the other testers have pretty much covered the same suggestions I would have listed. Here's some more I'd like to see:

    1) Have a button to "clear" loaded keyword/footprint lists. Right now if I import a list, I don't see a way to clear it if I want to remove it.

    2) On the results page it would be cool if it listed the results like: total URLs, x unique URLs, x unique domains, so we know the scraping results before downloading the file.

    That's all I can see for now. Can't wait to see how this service develops over the coming months.

    Another awesome service by Banditim :)
  • I'm definitely interested in trying this out. The theory behind this idea is very clever, so let's see how it works in practice.
  • If it is still possible, I would like to participate in this beta testing.
  • One more reviewer here - I'd really like to give you an honest review of the service.

  • I'm interested in trying it to scrape foreign keywords and see how it compares to Hrefer.
  • Tim89 www.expressindexer.solutions
    This service is practically a no-brainer for anyone who harvests their own URLs; it removes the need for additional proxies/servers for scraping and also increases productivity.

    @BanditIM‌ sorry about not replying yet, been a busy Monday, will look through the PM and get back to you.
  • BlazingSEO http://blazingseollc.com/proxy
    All PMs sent :)

    @magically -- Please refer to the beta thread in the original post. We have plans to create a 24/7 Footprint scraping system, which is not quite yet implemented. Completely unrelated to uploading your own footprint lists :)

    @s4nt0s - Thanks so much Mr. Devin for the awesome review and screenshot! We have both your suggestions already on our to-do list, just a matter of time before getting to them.

    Thanks everybody!
  • Shoot me a PM - would like to try this one
  • Even though I don't think people will enter only a few keywords and footprints, there appears to be a bug: when I entered 2 keywords and 2 footprints, the software said the campaign would use 4,000 links.

  • Missed the point that it's x1000 per term, but it's a bit confusing because what happens if there aren't 1000 results per term?
  • In the help drop-down, add a note on whether inurl: footprints are OK.
  • Review time :)

    Brief description and explanation : 

    This service really does what it says; it's like having a web-based version of GScraper, but without worrying about proxies and servers. The interface is very simple yet really user-friendly - it won't take you more than a minute to get it running.

    If you know how to scrape links and have had success with GScraper, you'll get even better results with this service, and it's much faster as well.

    Pros : 

    * User-friendly interface
    * No need for proxies
    * Fast results
    * Automated deduping of duplicate domains and URLs
    * Multiple delivery options
    * Provides truly good links if you know how to scrape correctly

    Cons : 

    * The service deducts the credits before it scrapes the links, which means you can't input a huge number of footprints/keywords, even if the final result wouldn't break your limit.


    Feature request :

    * Enable scraping from different geo-located Google domains (google.co.uk, google.de, etc.) - probably just a few lines of code :)
    * Enable filtering the links by custom terms, the same feature GScraper has.
    * When someone uses "inurl:footprint" in the footprint section, collect only the links which really have that term in the URL.
    * Filter the links by OBL. I know this feature would slow down the whole thing, but if you could do it I would pay extra for sure, because SER checks that really slowly.

    Bugs : 

    So far I've only had two issues. The first is that I couldn't get the confirmation email, so I had to register again with another email - but that's probably your SMTP issue. The second is that one scraping campaign was returned empty; there were no links inside the text files in the archive. Nothing major, though.


    That's it for now. I hope I'll be able to use the service for a few more days, but I will definitely become your customer once this goes live commercially - all the link sellers here can pack their bags after this goes live :).


  • So far, so good.

    It's fast as hell, returns decent lists (if you provide decent footprints, of course), and is very convenient to use.

    The only bug I found is that I can't start a project if I'm adding keywords through the form, but if I upload a file everything works well (Safari on Mac).
  • edited November 2014
    Hi,

    On your post it states "However, for beta testing, you will be given 24 million links FREE each day until the end of beta testing!"

    When I log in it says I only get 2.4 million links. Can you please look into this?

    "You have 2400000 Request Remaining. Buy More?"

    @BanditIM
  • BlazingSEO http://blazingseollc.com/proxy
    @SuperSEO - It's already being looked into. We have other larger bugs we're sorting out, but that one is an easy fix. Will be fixed by the end of the night.

    @Crownvic - We're fixing it :). Thanks so much for the great review!

    @dariobl - Thanks SO much for the long and thorough review pal! We found out the email registration issue was due to Aweber blocking your email address (spammer, huh? :P). As for the 0 results, I haven't heard of that yet; we'll definitely be looking into your campaigns.

    I've written down your suggestions! What do you mean by: "Enable filtering the links for custom terms, the same feature like gscraper have"?

    @fng - If they don't return the full 1000 links back, we give you the credit back on the ones it didn't return :)
  • @BanditIM‌

    I used my company email address for the first registration attempt, so it's definitely not an issue with my email address :), but no problem - I got registered with another address.

    Regarding the filtering feature, I meant allowing us to export only the links with specific words in the URL - for example, exporting only the URLs with .co.uk in the domain name.
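In Python terms, the filter I have in mind would be something like this (an illustrative sketch only, not an existing feature of the service):

```python
# Sketch of the requested custom-term filter: keep only scraped URLs
# whose host name contains the term, e.g. ".co.uk". Matching on the
# parsed host avoids false hits on terms that appear in the path.
from urllib.parse import urlparse

def filter_urls(urls, term):
    return [u for u in urls if term in urlparse(u).netloc]

urls = ["http://shop.example.co.uk/page",
        "http://example.com/co.uk-guide"]
print(filter_urls(urls, ".co.uk"))  # only the first URL survives
```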
  • @banditim I am interested, waiting for pm.
  • Hi

    Tried the beta version for the first time today. This is my impression

    - a nice clean interface

    - very intuitive and easy to use

    - I loaded my campaign and within about twenty minutes I received a notification email with my results

    - I did not experience any technical issues and the scraper delivered the URLs as promised.

    So far I am impressed and happy with the Scraper!

     

  • @BanditIM If there is a spot still available I'll leave a review, thanks
  • edited November 2014
    Did my first 3 test projects yesterday night.

    Initial observations

    - Very clean interface and incredibly easy to use. Seriously, this has to be one of the easiest platforms I've ever used. It's also very straightforward: you just upload your keywords and footprints and that's it. No learning curve at all.

    - Scraping seems pretty fast.

    - Great delivery format (3 lists: the entire list, unique domains and unique URLs). No need to sort or deduplicate the lists.

    - Awesome integration with Dropbox (although I haven't yet used that feature).

    There are still some bugs, like the copy/paste bug, only being able to scrape 2.4 instead of 24 million URLs, and the inurl: footprints not giving the desired results (possibly blocked advanced operators?), but the service has a lot of potential!
  • Hello

    I'll test it too if you still need beta testers.

    Thanks
  • I would like to beta test this service.

    Thanks!
  • I am interested in testing this new service and want to be a beta tester if the offer is still available.
  • I am interested in doing a beta test review. Thanks
  • donaldbeck Advanced SER Videos -> http://bit.ly/1ySrbwu | Learn SER For Free Step By Step -> http://sertips.com
    edited November 2014
    @BanditIM

    I'd echo a lot of what the other reviews said.

    There needs to be some way to upload a ton of keyword+footprint combos in one go. Uploading little chunks is too time consuming. Allowing the user to control the number of scraped URLs would be nice too, i.e. being able to select 100 URLs scraped instead of 1000 all the time.

    Otherwise it looks nice!
  • I'm also interested, please PM me.
  • The idea seems very interesting, so if there's still an opportunity to test the service, I'm really interested :)
  • BlazingSEO http://blazingseollc.com/proxy
    We currently have a pretty large bug found in one user's uploaded list (something to do with a funky character). We're trying to find out exactly which keyword is causing it, so we may be down until tomorrow morning unfortunately. Sorry for the inconvenience - this is what beta testing is for though :)
  • @BanditIM - I nevertheless still very much want to be a part of the beta testing.
  • @BanditIM, bugs are great, hopefully I can uncover a few for you
  • Will post as honest a review as it can be.
    PM me :)
  • I tried the service yesterday, and so far I must say the job is done well; I think it has potential.
    Scraping is fast, cheaper, and beginner-friendly.

    So, I am pretty sure this service will be perfect for users with limited resources, because:
    - no need for scraping software (GScraper, ScrapeBox, Hrefer, ...)
    - no need for VPS
    - no need for proxies
    - cheap
    - fast
    - easy to use

    Scraped URLs/results can be sent to email automatically after the task is completed, or uploaded to FTP or Dropbox.
    I also like the sorting of the results into "full list", "unique domains" and "unique URLs".


    I have only one question: what does "Enclose Footprints:" mean?



  • BlazingSEO http://blazingseollc.com/proxy
    @vuli - It means that if you did not enclose your footprints in quotes beforehand (i.e. powered by Wordpress), this checkbox would make it: "powered by Wordpress".
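An illustrative sketch of what the checkbox does (my own pseudocode, not our actual implementation):

```python
# Sketch of the "Enclose Footprints" option: wrap each footprint in
# quotes unless it is already quoted, so the search engine treats it
# as an exact phrase. Function name is invented for illustration.
def enclose_footprints(footprints):
    return [fp if fp.startswith('"') and fp.endswith('"')
            else '"%s"' % fp
            for fp in footprints]

print(enclose_footprints(["powered by Wordpress", '"powered by vBulletin"']))
# ['"powered by Wordpress"', '"powered by vBulletin"']
```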
  • I am interested in trying out this scraper and writing an honest review :)
This discussion has been closed.