ScraperBandit - Beta Testing Open To Public

Comments

  • Interested, let me try your great work!
  • Just want to know if I can use it to scrape for just keywords only, without using footprints
  • edited November 2014
    OK, so here is my review

    After I set up the campaign in the evening, I was greeted with the "Building Campaign..." message until the process finished. There was no progress notification.

    The campaign had just one footprint for a contextual engine and 15k additive words.

    Campaign start time: 2014-11-29 10:50:44
    Campaign end time: 2014-11-29 22:27:10

    Results were presented in 3 files:
        121 k unique URLs
        6 k unique domains
        4,530.6 k total URLs

    The same setup run in parallel in Hrefer, using free proxies, produced about 30% fewer results (not to mention the cost of my own software, CPU power, bandwidth, electricity and so on).

    The system is in its infancy, but it's already stable enough to be used as a solid alternative to professional scrapers that require a lot more knowledge, hardware, software resources and substantial upfront investment.
  • I have one suggestion if possible.

    1. Would be nice to have an option to remove well-known domains from the scraped files, like ebay, amazon, youtube etc.
    2. Would be nice to have an option to remove extensions such as .jpg, .pdf, etc.

    These two options would save us some time. Anyway, even without them the service is just awesome :)
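    Until such options exist, both filters are easy to prototype as a post-processing pass over the scraped files. A minimal Python sketch; the blocklists and the `keep_url` helper are my own illustrative names, not anything the service provides:

```python
from urllib.parse import urlparse

# Illustrative blocklists -- these names and values are assumptions,
# not options ScraperBandit currently offers.
BLOCKED_DOMAINS = {"ebay.com", "amazon.com", "youtube.com"}
BLOCKED_EXTENSIONS = (".jpg", ".png", ".gif", ".pdf")

def keep_url(url):
    """Drop URLs on well-known domains or pointing at binary files."""
    parsed = urlparse(url)
    host = parsed.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    if host in BLOCKED_DOMAINS:
        return False
    if parsed.path.lower().endswith(BLOCKED_EXTENSIONS):
        return False
    return True

urls = [
    "http://www.youtube.com/watch?v=abc",
    "http://example.com/photo.jpg",
    "http://example.com/blog/post-1",
]
kept = [u for u in urls if keep_url(u)]
# kept == ["http://example.com/blog/post-1"]
```

    Note the exact-match blocklist would miss subdomains like shop.ebay.com; a real filter would want a suffix check.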
  • Tim89Tim89 www.expressindexer.solutions
    My current scrape has been running since yesterday and still hasn't completed. @BanditIM, chime in for me please :)
  • BlazingSEOBlazingSEO http://blazingseollc.com/proxy
    edited November 2014
    @jackrice - Yes you can

    @Seljo - It's on the todo list already. We have SO many plans to make this thing perfect, it just takes time :)

    @Tim89 - I was away for the weekend so was unable to keep an eye on the system. It appears there's a slight bug in delivering some campaigns (I assume it has to do with some funky characters). It'll be fixed by the end of the night.


    Everyone else - Will send out PMs shortly for beta access. Thank you all for the awesome reviews of the service; we will be opening up link sales fairly soon for those of you who want to scrape more per day.

    I'm excited to announce that we will be releasing a VERY nice update tonight, so nice that it deserves its own version upgrade - ScraperBandit v1.1.

    The update includes one incredible addition: campaign scheduling.
    • We will now allow users to upload an unlimited number of keywords+footprints, and the campaign will run indefinitely until your credits run out or the campaign finishes.
    • You will be able to Play/Pause campaigns that you wish to hold off on until a later date.
    • Finally, you will be able to schedule campaigns to only scrape XXXX links per day :). So if some campaigns are more important than others and you don't have an endless pocket book, then you can have some campaigns scrape slower than others. Note: Currently we do not have the option to change this 'XXXX' number, so once it's set that campaign will run indefinitely until it's finished or until you Pause it.
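    The per-day cap described above boils down to a counter that resets at midnight. A hedged sketch of how such a scheduler could work; the `Campaign` class and the `DAILY_CAP` value are assumptions for illustration, not the service's actual internals:

```python
import datetime

DAILY_CAP = 5000  # stands in for the fixed 'XXXX' links-per-day number

class Campaign:
    """Toy model of a scheduled campaign with a daily link cap."""

    def __init__(self, keywords):
        self.keywords = list(keywords)
        self.scraped_today = 0
        self.day = datetime.date.today()
        self.paused = False

    def can_scrape(self):
        today = datetime.date.today()
        if today != self.day:       # new day: reset the counter
            self.day = today
            self.scraped_today = 0
        return (not self.paused) and self.scraped_today < DAILY_CAP

    def record(self, n_links):
        """Count links delivered toward today's cap."""
        self.scraped_today += n_links
```

    Once `scraped_today` reaches the cap, the campaign simply waits for the next day; the Play/Pause toggle is just the `paused` flag.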


    Stay tuned to your emails for the official release. In the meantime, we will be fixing up some bugs during the day so that everyone's campaigns finish up.

    Thanks for the support everybody, we're on our way to being the ONLY scraper you'll ever need again :)

  • Would love to get in on the beta version of this, if possible. I use gscraper daily and faithfully, and I would definitely love to see something like this.


  • Would love to give this a go. And have no problem providing feedback/reviews or whatever needed. Your email services were awesome when I used them and I am sure this will be too.

    Look forward to hearing from you if this opportunity is still available.
  • Can I test this? I want to try.
  • @banditim - SERVICE REVIEW
    ScraperBandit delivers what it promises - clean and uninterrupted scraping. No need to worry about proxies or sticking to only simple footprints. For me this tool is useful when I want to scrape with specific footprints, like inurl:, that are much harder to scrape without getting banned by Google.
    The campaign was saved to my Dropbox folder and that's it. Simple and powerful, although I must admit I didn't check how long it took to get the results.

    Possible upgrades:
    1. Remove duplicate domains while scraping (gscraper has this feature).
    2. Scrape specific timeframe e.g. past year (SB has this feature).
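    Suggestion 1 can also be done today as a post-processing pass: keep only the first URL seen for each host and drop the rest. A small sketch; the function name is mine, not gscraper's or ScraperBandit's:

```python
from urllib.parse import urlparse

def dedupe_domains(urls):
    """Keep the first URL encountered for each host, drop the rest."""
    seen = set()
    out = []
    for url in urls:
        host = urlparse(url).netloc.lower()
        if host not in seen:
            seen.add(host)
            out.append(url)
    return out

urls = [
    "http://example.com/a",
    "http://example.com/b",   # same host, dropped
    "http://other.org/c",
]
# dedupe_domains(urls) -> ["http://example.com/a", "http://other.org/c"]
```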


  • Please sign me up for beta test...Thanks
  • would like to participate in beta if possible.
  • bellathecat78bellathecat78 Cardiff, Wales
    Can I get a beta test too?
  • sing99sing99 Los Angeles
    I'm ready to test this.

  • BlazingSEOBlazingSEO http://blazingseollc.com/proxy
    All PMs sent. This will be the final Beta round of testers. We will still give 500k+ link credits to all new sign-ups when we go live tonight, so people can still test the system.

    Look to your emails for our BIG update as well as information on us going live. This email is extremely important if you do not want to burn your link credits.
  • Tim89Tim89 www.expressindexer.solutions
    Hey @BanditIM‌ what does keywords queued mean?
  • BlazingSEOBlazingSEO http://blazingseollc.com/proxy
    @Tim89 - Sending an email blast right now. Check it out, it explains all the new Status messages as well as some very important information :)
  • Tim89Tim89 www.expressindexer.solutions
    Cool thanks mate will look through it.
  • I'm interested in testing it
  • It's a very quick and easy service to use. Certainly saves time on the traditional gscraper/scrapebox way of doing things.

    I think for most people it will be a no brainer to use this, particularly people who are new to scraping and don't already have the tools.

    It will be interesting to see how the cost works out in comparison to how I was doing things before, but I expect it will be favourable.
  • How do I signup ? 
  • Tonight I used what I call the SafeScrape footprints and a keyword list from Donald Beck.  It's just contextual, Web 2.0, articles, Wikis and Social Networks.

    I dumped 101 keywords and all the footprints.

    A few hours later I got this...

    Thanks for utilizing ScraperBandit.
    Your campaign SafeScrape-Keywords25 has finished.

    4388017 links found.
    1510226 unique links found
    319431 unique domains found

    Never would have completed this on my home VPN using Scrapebox.  It would have taken 24 hours to complete if not more.

    That's just amazing.  Already at 68 submitted so far.  It will take time as these aren't just trackbacks, blog comments and such but wow... so damn fast.
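    The three numbers in that completion email relate in a simple way: the raw total, the set of distinct links, and the set of hosts those links belong to. A sketch of how such counts could be derived from a raw results file; the names are mine, not ScraperBandit's code:

```python
from urllib.parse import urlparse

def campaign_stats(links):
    """Return (total links, unique links, unique domains) for a scrape."""
    unique_links = set(links)
    unique_domains = {urlparse(u).netloc.lower() for u in unique_links}
    return len(links), len(unique_links), len(unique_domains)

links = [
    "http://a.com/1",
    "http://a.com/1",   # duplicate link
    "http://a.com/2",   # same domain, different link
    "http://b.net/x",
]
# campaign_stats(links) -> (4, 3, 2)
```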
  • Well, what can I say? Thank you for letting me use your servers to scrape for free!
    I used all the footprints for WordPress (GSA footprints from the search settings) and some keywords. After 6 hours I had results.
    The full list of links can't even be counted by Scrapebox :)
    I have 30k unique domains to play with (a few million links)...

    Please tell me the price and I will put all my Scrapebox servers to sleep.

    Please add a keyword scraper, a counter for credits, maybe some coloured buttons and a few YouTube links with explanations, and you are ready to fill your bank accounts :)

    Your solution is the next step in Google scraping!

    Congratulations!

  • Would like to beta test, please.
  • sing99sing99 Los Angeles
    After it appeared not to save my Dropbox login...

    I got this error under my uploaded footprint and keyword files
    Error #-200: HTTP Error.

    It appears to be working after I uploaded footprints and keywords manually.

    But if it's not connected to dropbox, where will the output go?

    UPDATE: it did create the dropbox folder - there's just no confirmation of that.
  • edited December 2014
    I suspect that I joined the beta a few minutes after you sent out the big email announcement; I certainly haven't received it.

    Are there any plans to scrape Google UK as well?
  • edited December 2014
    I have not read through all the pages on this but have a couple of questions.

    Can you use site: and inurl: footprints in this scraper (including combinations of both)?

    If so, does it throttle them in any way?

    How many footprint/keyword searches will it do in an hour?

    Also, what are you charged for: total URLs found or unique URLs found?

    Thanks
  • Yes Neil, this sounds awesome. I would definitely do a review, and if the beta goes well, I'll be a buyer too. Love the concept.

    Beta Tester Waiting .. :):)
    B-)

  • Tim89Tim89 www.expressindexer.solutions
    @BanditIM am I able to download what I have scraped at the moment with the update that rolled out yesterday? I see that you have introduced the pause button, but I can't see a download button.

    It is still in the process of pausing, I think ('Paused - Pausing...'). I can only assume a 'Download' button will appear once the campaign has paused successfully?
This discussion has been closed.