ScraperBandit - Beta Testing Open To Public

Comments

  • I am interested in trying out this scraper and will write a good review :)
  • When can we expect to see the scraper running live again? :)
  • BlazingSEO http://blazingseollc.com/proxy
    It's back up, just getting everything sorted out to make sure it's perfect again. You can start posting campaigns again though, @dariobl.
  • I just tested out this service and I have to say it is amazing.  You will save a lot of money by using this service as you will not have to pay for the following:

    -dedicated server $60+
    -proxies $40+
    -software $70

    All you have to do is upload your footprints and keywords, and it will automatically scrape for you. It finished a 13 million URL scrape in a few hours. No need to worry about anything.

  • I am interested in beta testing.
  • BlazingSEO http://blazingseollc.com/proxy
    @SuperSEO - Thank you very much for the review, appreciated :)

    All PMs for beta testers above this post have been sent - enjoy everyone and please PM me any bugs you find!
  • I am interested in this service and will leave a review.  Please PM me the details.  Thanks
  • I am late to the party, but if review copies are still available I'll take one.
  • I would like to test this scraper.
    Thanks in advance
  • I'd love to try out ScraperBandit! Sounds intriguing

    Cheers
  • WOW

    I was one of the first beta testers - what was that, like 2 days ago? I swear it seems to be working better and better with every scrape.

    CRITICAL: We REALLY need a better way to set up campaigns. With that credit system, I'm back in there setting up a campaign every hour or so. The ideal setup:
    I load in a monstrous list of footprints and keywords, and the system runs the campaign until the credits are used, pauses until the next day, then picks up where it left off (rough sketch below).

    That's the only real issue I see with this.  Other than that - this is INCREDIBLE
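
    Roughly what I have in mind, as a sketch; every name here is made up, this is not their actual code:

```typescript
// A rough sketch of the daily credit budget loop described above.
// All names are hypothetical; this is not ScraperBandit's actual code.

interface Campaign {
  searchTerms: string[]; // footprint x keyword combinations left to scrape
  cursor: number;        // index of the next unscraped term, persisted between days
}

const DAILY_CREDITS = 1_000_000; // assumed per-day link budget

// Run one day's worth of scraping, then stop until the next day.
async function runDay(
  campaign: Campaign,
  scrape: (term: string) => Promise<number>, // returns credits consumed
): Promise<void> {
  let creditsLeft = DAILY_CREDITS;
  while (creditsLeft > 0 && campaign.cursor < campaign.searchTerms.length) {
    creditsLeft -= await scrape(campaign.searchTerms[campaign.cursor]);
    campaign.cursor++; // persisting this is what lets tomorrow pick up here
  }
}
```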


  • Whoops.

    The last scrape was not delivered to Dropbox.

    I swear I ticked the option.
  • @almostbasic

    The way you recommend would cause a huge server overload. I'd rather stick with the current system and have a stable platform than wait tens of hours to get a single campaign finalized :)

    Anyway, I just want to say once again that this service is a pure goldmine; I got more contextual links in the last 24 hours than in the last 2 months of purchasing various lists from various providers, wasting my time and money.

    I just hope that the price will be affordable for this service too :) 

    I would pay $100 or a little more per month for what we have right now, 24m links per day. Even if it were fewer links, I'm definitely not exporting that much because SER can't take it all :) Even 10m-15m would definitely do the job.
  • I am interested. I will reply with an honest review.
  • @banditim, sent you a PM, please check.
  • Interested too in testing this baby, and I will write an honest review
  • Count me in, if there are any spots left open :)
  • Scraper Bandit..

    I have to say I am already amazed after just a little bit of use..

    Scraper Bandit saves your resources.. proxies.. cpu usage.. time.. and more

    You can set up all your tasks and walk away without having to babysit your projects..

    The interface is very simple and effective!

    It didn't bring back a huge list filled with crap, like GScraper does if you're not careful...

    Pretty sure you can hook it up to Dropbox and read from a file to build links as you scrape as well..

    I will be testing some more, but I just want to say that so far I am really liking the Google scraper from the Bandit team. Great work, Neil!


  • @dariobl - see, I think it would do quite the opposite. By having scheduled campaigns, they will know how much demand there is and can schedule things accordingly. Right now, if a bunch of people go in at once and launch massive campaigns, it's going to get hammered all at once.

    I think it would be easy to set it up with an option to split up the results by XXX # of links and email / FTP / Dropbox the results when it reaches that number.

    So, you set up a campaign that could last weeks and set it to deliver results every 1,000,000 links via email, Dropbox, etc. (roughly like the sketch at the end of this post). Then, roughly once an hour you'll get a fresh list of links delivered.

    This system is incredible - however since I've been using it, I've had it open on my computer all day long, and am regularly in there starting a new campaign, so it's using up more time than I think it needs to.

    If this was more of a set it and forget it solution - I think it would be a game changer.  Don't get me wrong - I'm already on board, I love this system.  But I think it could be a lot better if it were more automated.
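
    To make the delivery idea concrete, the logic I'm imagining is roughly this (a sketch with hypothetical names, not their actual code; deliver() stands in for whatever email/FTP/Dropbox hand-off they use):

```typescript
// Rough sketch of "deliver results every N links". Hypothetical names only.

const CHUNK_SIZE = 1_000_000; // deliver a fresh list every million links

let buffer: string[] = [];

async function onLinkScraped(
  url: string,
  deliver: (chunk: string[]) => Promise<void>, // email / FTP / Dropbox hand-off
): Promise<void> {
  buffer.push(url);
  if (buffer.length >= CHUNK_SIZE) {
    const chunk = buffer;
    buffer = [];          // start collecting the next batch right away
    await deliver(chunk); // ship this batch to the user
  }
}
```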
  • Hi @BanditIM
    found this thread really late; do you need one more beta tester?

    If so, please count me in…

    I have a lot of experience with ScrapeBox and GScraper.
    Thanks, Marc
  • BlazingSEO http://blazingseollc.com/proxy
    All PMs sent for beta access.

    @Crownvic - Looking into it, thanks for the report

    @almostbasic and @dariobl - Please see our update post :) http://banditim.com/blog/scraperbandit-update-1/ - it should be ready by tonight! Dariobl, you will be able to use the system as it is now by simply creating small campaigns. Users like almostbasic will now have the option to submit large campaigns and not have to worry about constantly loading in new ones.

    @backlinkaddict - Thanks so much for the great review pal!


  • I'm interested to test this beast :)
  • After 2 days of testing it, I have to say it is really great. I think it would be better to add "scheduled campaigns": we could load in many keywords and footprints and just set it to deliver some results every day. That way we would not have to log in and do the same work over and over again.
    Another suggestion: some ready-to-use footprints available in the panel, so the user would only have to check which platforms he wants to look for, set keywords, and click "Start". Of course, the Scraper community would be able to edit the footprints for each platform and share them with other users.

  • +1 For being able to schedule campaigns

    I would like to see a feature added to the product: the ability to view the footprints and keywords that were used in a completed scrape, so we don't use those footprints and keywords again.

    Also, I am very interested in the pricing for this after beta testing is completed.  I hope it will be reasonable and affordable.

    @BanditIM

    Love the service you are providing and keep up the good work.
  • BlazingSEO http://blazingseollc.com/proxy
    @adam and @SuperSEO - See our blog post about scheduled campaigns - http://banditim.com/blog. We are very close to having this implemented :).

    As for seeing the keywords/footprints used, we had this feature already, but we took it off for now for simplicity's sake. It'll be added back soon enough.

    Pricing - Until we see the demand and what search terms users give us (hard or easy footprints, for example), I have set the price at $5 per 10 million links.
  • Would love a test drive of the system. I'm in need of a system like this; lists and having SER scrape are not cutting it.
  • Just tested it, and so far it seems like a user-friendly and time-saving service.
  • Just finished my testing. Scraping is amazing, and the results are deduped in two segments: by URL and by domain. So overall a great service.

    Dropbox

    Integrating with cloud storage is a nice idea. I had my Dropbox instance running locally and on my GSA server. After some other work I came back to my desktop and found the job status marked as completed. I checked my Dropbox in both instances, and the archive with the URLs was already there. Overall a nice experience.

    Scraping

    I have yet to evaluate how closely this scraper's results match the footprints, compared against ScrapeBox. Overall the numbers are great. I tried 110 footprints and 200 keywords, which makes a combination of 22k search terms. The results I got from the BIM Google scraper were as follows:

    • Non De-Duped = 2.2 Million
    • De-duped based on URL = 710K
    • De-duped based on Domains = 270K

    On average each search term returned around 100 results (2.2M across 22k terms), which is great too. (A minimal sketch of the two dedup passes is below.)
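
    For anyone curious, the two dedup passes above amount to something like this sketch (my own illustration, not BanditIM's implementation):

```typescript
// Minimal sketch of the two dedup passes: by exact URL, then by domain.

function dedupeByUrl(urls: string[]): string[] {
  return [...new Set(urls)]; // keep each exact URL once
}

function dedupeByDomain(urls: string[]): string[] {
  const seen = new Set<string>();
  const out: string[] = [];
  for (const u of urls) {
    const host = new URL(u).hostname; // e.g. "forum.example.com"
    if (!seen.has(host)) {
      seen.add(host);
      out.push(u); // keep the first URL seen for each domain
    }
  }
  return out;
}
```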

    GSA SER Submissions Of Scraped Targets

    I will run them in GSA SER and update this thread once done. I am evaluating this because the footprints were taken from GSA SER verbatim.

    Enhancements If Possible

    1) Queue Updates While Job Running

    The status sat at 'Keywords Queued' for more than an hour without being processed, I guess because of the queue. Displaying the number of jobs in the queue ahead of mine would be more informative than the current situation.

    2) Keep The Inputs Alive In Create Campaign

    On the Create Campaign screen, after entering footprints and keywords, there is an option to enable the Dropbox feature. When I tried it, whatever I had typed on the Create Campaign screen was gone. Retaining what I typed when I come back would be a great help for us.

    So, a possible solution (for your developers to look into):

    1. When clicking the Dropbox feature button, do a server-side postback and keep the typed values in the session.
    2. Open the Dropbox window as a JavaScript popup, and once the work is done refresh the Create Campaign page.
    3. While reloading, put the data kept in the session in step 1 back into the form.

    You can get even better solutions with jQuery, without a server-side postback, if you'd like me to suggest one.
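
    For example, here is a browser-side sketch that skips the postback entirely, using plain DOM and localStorage rather than jQuery (the field IDs are made up):

```typescript
// Hypothetical sketch: mirror the Create Campaign inputs into localStorage
// so the Dropbox detour cannot wipe them. Field IDs are invented.

const FIELD_IDS = ["footprints", "keywords"];

function saveInputs(): void {
  for (const id of FIELD_IDS) {
    const el = document.getElementById(id) as HTMLTextAreaElement | null;
    if (el) localStorage.setItem(`campaign:${id}`, el.value);
  }
}

function restoreInputs(): void {
  for (const id of FIELD_IDS) {
    const el = document.getElementById(id) as HTMLTextAreaElement | null;
    const saved = localStorage.getItem(`campaign:${id}`);
    if (el && saved !== null) el.value = saved;
  }
}

// Call saveInputs() before opening the Dropbox popup, and restoreInputs()
// on page load when the user returns to Create Campaign.
```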

    Overall, this service is going to be popular among GSA users for sure. I will keep testing and let you know my views on the system, as scraping is my first priority for getting better quality backlinks through GSA.

  • I'm an addicted user of ScrapeBox; if there is anything in this world that can scrape more than SB, I would love to have it.

    I'm very interested in the beta test, to see if it can outperform SB. Then I will surely give my review here without partiality.
  • Hi, can I test it?
This discussion has been closed.