
ScraperBandit - Beta Testing Open To Public


Comments

  • Tim89 www.expressindexer.solutions
    This service is practically a no-brainer for anyone who harvests their own URLs; it removes the need for additional proxies/servers for scraping and also increases productivity.

    @BanditIM sorry about not replying yet, it's been a busy Monday; I'll look through the PM and get back to you.
  • BlazingSEO http://blazingseollc.com/proxy
    All PMs sent :)

    @magically -- Please refer to the beta thread in the original post. We have plans to create a 24/7 footprint-scraping system, which is not yet implemented. Completely unrelated to uploading your own footprint lists :)

    @s4nt0s - Thanks so much, Mr. Devin, for the awesome review and screenshot! We already have both your suggestions on our to-do list; it's just a matter of time before we get to them.

    Thanks everybody!
  • Shoot me a PM; I'd like to try this one.
  • Even though I don't think people will enter only a few keywords and footprints, there is a bug: when I entered 2 keywords and 2 footprints, the software said the campaign would use 4,000 links.

  • I missed the point that it's x1000 per keyword+footprint query, but it's a bit confusing: what happens if there are not 1,000 results per query?
  • In the help drop-down, add a note on whether inurl: footprints are OK.
  • Review time :)

    Brief description and explanation:

    This service really does what it says; it's like having a web-based version of GScraper, but without worrying about proxies and servers. The interface is very simple and at the same time really user-friendly; it won't take you more than a minute to get it running.

    If you know how to scrape links and have had success with GScraper, you'll get even better results with this service; it's much faster as well.

    Pros:

    * User-friendly interface
    * No need for proxies
    * Fast results
    * Automated deduplication of domains and URLs
    * Multiple delivery options
    * Provides truly good links if you know how to scrape correctly 

    Cons:

    * The service deducts the credits automatically before it scrapes the links, which means you won't be able to input a huge amount of footprints/keywords, even if your final result wouldn't break your limit.


    Feature requests:

    * Enable scraping from Google's different geo locations (google.co.uk, google.de, etc.); probably just a few lines of code :)
    * Enable filtering the links for custom terms, the same feature GScraper has.
    * When someone uses "inurl:footprint" in the footprint section, collect only the links that really have that term in the URL.
    * Filter the links by OBL (outbound link count); I know this feature would slow down the whole thing, but if you could do it I would pay extra for sure, because SER checks that really slowly. A rough sketch of such a check follows this list.
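
    For illustration, the OBL check could even be run client-side on the delivered lists. A minimal Python sketch, stdlib only; the helper name count_obl and the threshold of 50 are made up, not anything the service exposes:

    ```python
    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urlparse

    class LinkCounter(HTMLParser):
        # Counts <a href=...> links that point off the page's own host.
        def __init__(self, own_host):
            super().__init__()
            self.own_host = own_host
            self.obl = 0

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            host = urlparse(dict(attrs).get("href") or "").netloc
            if host and host != self.own_host:
                self.obl += 1

    def count_obl(url, timeout=10):
        # Hypothetical helper: fetch the page and count its outbound links.
        html = urllib.request.urlopen(url, timeout=timeout).read()
        parser = LinkCounter(urlparse(url).netloc)
        parser.feed(html.decode("utf-8", errors="replace"))
        return parser.obl

    # Keep only URLs below an arbitrary OBL threshold of 50.
    urls = ["http://example.com/guestbook"]
    low_obl = [u for u in urls if count_obl(u) <= 50]
    ```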

    Bugs:

    So far I have had only two issues. The first is that I couldn't get the confirmation email, so I had to register again with another address; that's probably your SMTP issue. The second is that one scraping campaign came back empty (there were no links inside the text files in the archive), but nothing major.


    That's it for now. I hope I'll be able to use the service for a few more days, but I will definitely become your customer once this goes live commercially; all link sellers here can pack their bags after this goes live :).


  • So far, so good.

    It's fast as hell, returns a decent list (if you provide decent footprints, of course), and is very convenient to use.

    The only bug I found is that I can't start a project when I add keywords through the form, but if I upload a file everything works well (Safari on Mac).
  • Hi,

    Your post states: "However, for beta testing, you will be given 24 million links FREE each day until the end of beta testing!"

    When I log in, it says I only get 2.4 million links. Can you please look into this?

    "You have 2400000 Request Remaining. Buy More?"

    @BanditIM
  • BlazingSEO http://blazingseollc.com/proxy
    @SuperSEO - It's already being looked into. We have other larger bugs we're sorting out, but that one is an easy fix. Will be fixed by the end of the night.

    @Crownvic - We're fixing it :). Thanks so much for the great review!

    @dariobl - Thanks SO much for the long and thorough review, pal! We found out the email registration failure was due to Aweber blocking your email address (spammer, huh? :P). As for the 0 results, we haven't heard of that yet; we'll definitely be looking into your campaigns.

    I've written down your suggestions! What do you mean in regard to: "Enable filtering the links for custom terms, the same feature GScraper has"?

    @fng - If a query doesn't return the full 1,000 links, we give you the credits back for the ones it didn't return :)
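
    In pseudocode terms, the accounting is roughly this (illustrative Python, not our actual billing code):

    ```python
    RESULTS_PER_QUERY = 1000  # each keyword+footprint query is billed at 1,000 links

    def upfront_cost(keywords, footprints):
        # 2 keywords x 2 footprints -> 4 queries -> 4,000 credits deducted upfront
        return len(keywords) * len(footprints) * RESULTS_PER_QUERY

    def refund_for_query(requested, returned):
        # Credits given back for the links a query failed to deliver.
        return max(requested - returned, 0)

    print(upfront_cost(["kw1", "kw2"], ["fp1", "fp2"]))  # 4000
    print(refund_for_query(1000, 640))                   # 360 credits back
    ```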
  • @BanditIM

    I used my company email address for the first registration attempt, so it's definitely not an issue with my email address :). But no problem, I got it registered with another address.

    Regarding the filtering feature, I meant allowing us to export only the links with specific words in the URL, for example exporting only the URLs with .co.uk in the name. Something like the sketch below.
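
    A rough Python sketch of that kind of export filter; the function name filter_urls is made up. The same substring check would also cover verifying inurl: footprints client-side:

    ```python
    def filter_urls(urls, terms):
        # Keep only URLs containing at least one of the given terms.
        terms = [t.lower() for t in terms]
        return [u for u in urls if any(t in u.lower() for t in terms)]

    scraped = [
        "http://blog.example.co.uk/post?p=1",
        "http://forum.example.com/thread/2",
    ]
    print(filter_urls(scraped, [".co.uk"]))  # only the .co.uk link survives
    ```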
  • @BanditIM I am interested, waiting for a PM.
  • Hi

    Tried the beta version for the first time today. These are my impressions:

    - a nice clean interface

    - very intuitive and easy to use

    - I loaded my campaign, and within about twenty minutes I received a notification email with my results.

    - I did not experience any technical issues and the scraper delivered the URLs as promised.

    So far I am impressed and happy with the Scraper!

     

  • @BanditIM If there is still a spot available, I'll leave a review. Thanks!
  • Ran my first 3 test projects last night.

    Initial observations

    - Very clean interface and incredibly easy to use. Seriously, this has to be one of the easiest platforms I've ever used. It's also very straightforward: you just upload your keywords and footprints and that's it. No learning curve at all.

    - Scraping seems pretty fast.

    - Great delivery format (3 lists: the entire list, unique domains, and unique URLs). No need to sort and deduplicate the lists yourself; see the sketch after this comment.

    - Awesome integration with Dropbox (although I haven't yet used that feature).

    There are still some bugs, like the copy/paste bug, only being able to scrape 2.4 million instead of 24 million URLs, and inurl footprints not seeming to give the desired results (possibly blocked advanced operators?), but the service has a lot of potential!
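
    For reference, the three delivery lists can be reproduced locally in a few lines. A minimal Python sketch; how the service actually picks which URL represents each domain is unknown, so first-seen is assumed here:

    ```python
    from urllib.parse import urlparse

    def split_lists(urls):
        # Build the three delivery lists: full list, unique URLs, unique domains.
        unique_urls = list(dict.fromkeys(urls))  # order-preserving dedup
        seen, unique_domains = set(), []
        for u in unique_urls:
            domain = urlparse(u).netloc
            if domain not in seen:
                seen.add(domain)
                unique_domains.append(u)  # keep the first URL seen per domain
        return urls, unique_urls, unique_domains

    full, uniq_urls, uniq_domains = split_lists([
        "http://a.com/1", "http://a.com/1", "http://a.com/2", "http://b.com/x",
    ])
    print(len(full), len(uniq_urls), len(uniq_domains))  # 4 3 2
    ```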
  • Hello

    I'll test it too if you still need beta testers.

    Thanks
  • I would like to beta test this service.

    Thanks!
  • I am interested in testing this new service and want to be a beta tester if the offer is still available.
  • I am interested in doing a beta test review. Thanks!
  • donaldbeck Advanced SER Videos -> http://bit.ly/1ySrbwu | Learn SER For Free Step By Step -> http://sertips.com
    @BanditIM

    I'd echo a lot of what the other reviews said.

    There needs to be some way to upload a ton of keyword+footprint combos in one go; uploading little chunks is too time-consuming. Allowing the user to control the number of scraped URLs would be nice too, i.e. being able to select 100 URLs scraped instead of 1,000 all the time.

    Otherwise it looks nice!
  • I'm also interested, please PM me.
  • The idea seems very interesting, so if there is still an opportunity to test the service, I'm really interested :)
  • BlazingSEO http://blazingseollc.com/proxy
    We currently have a pretty large bug found in one user's uploaded list (something to do with a funky character). We're trying to find out exactly which keyword is causing it, so we may be down until tomorrow morning, unfortunately. Sorry for the inconvenience; this is what beta testing is for, though :)
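
    For the curious: a "funky character" almost always means an encoding issue. The general shape of a guard against it looks like this (illustrative Python with a hypothetical filename, not our exact code):

    ```python
    import unicodedata

    def clean_keyword_line(raw: bytes) -> str:
        # Decode an uploaded line defensively and drop control characters.
        text = raw.decode("utf-8", errors="replace")  # never crash on bad bytes
        return "".join(
            ch for ch in text if unicodedata.category(ch)[0] != "C"
        ).strip()

    with open("keywords.txt", "rb") as f:  # hypothetical uploaded list
        keywords = [kw for kw in (clean_keyword_line(line) for line in f) if kw]
    ```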
  • @BanditIM, I nevertheless still very much want to be a part of the beta testing.
  • @BanditIM, bugs are great, hopefully I can uncover a few for you
  • Will post as honest a review as it can be.
    PM me :)
  • I tried the service yesterday, and so far I must say the job is done well; I think it has potential.
    Scraping is fast, cheap, and beginner-friendly.

    So, I am pretty sure this service will be perfect for users with limited resources because:
    - no need for scraping software (GScraper, ScrapeBox, Hrefer, ...)
    - no need for VPS
    - no need for proxies
    - cheap
    - fast
    - easy to use

    Scraped URLs (the results) can be sent to email automatically after a task completes, or uploaded to FTP or Dropbox.
    I also like the feature of sorting the results into "full list", "unique domains", and "unique URLs".


    I have only one question: what does "Enclose Footprints:" mean?



  • BlazingSEO http://blazingseollc.com/proxy
    @vuli - It means that if you did not enclose your footprints in quotes beforehand (e.g. powered by Wordpress), this checkbox would make it: "powered by Wordpress".
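
    Under the hood it amounts to something like this (a sketch, not the actual code):

    ```python
    def enclose(footprint: str) -> str:
        # Wrap a footprint in double quotes unless it already is quoted.
        fp = footprint.strip()
        if len(fp) >= 2 and fp.startswith('"') and fp.endswith('"'):
            return fp
        return f'"{fp}"'

    print(enclose('powered by Wordpress'))    # -> "powered by Wordpress"
    print(enclose('"powered by Wordpress"'))  # already quoted; unchanged
    ```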
  • I am interested in trying out this scraper and writing a good review :)
This discussion has been closed.