
ScraperBandit - The ONLY Scraping System You'll Ever Need!

BlazingSEO http://blazingseollc.com/proxy
edited December 2014 in Buy / Sell / Trade

Comments

  • BlazingSEO http://blazingseollc.com/proxy
    edited December 2014
  • PLEASE don't look for more reviews - just get started with ScraperBandit! I was a beta tester, and I can say I gave away my GScraper licenses free of cost to someone who needed them [I don't need GScraper now because of ScraperBandit], since ScraperBandit does everything for me 24/7 without any hassle and with no more extra costs for servers, proxies, etc.

    this is AWESOME! 10/10 +1 from me for great service.
    Thanks guys! - I hope to see your PBN service next [CMS/subdomain installation automation service]
  • ScraperBandit is a very good online tool with which you can build your own lists very quickly. I scraped 100k URLs in less than five minutes using this service, and I highly recommend it to everyone who uses GSA SER or any other link-building tool.

    Great job guys!
  • Hi,

    Can you fix this error on my admin panel -

    "Please Confirm your email address before using this service"


    I'm already a member and buy email accounts from you.

  • BlazingSEO http://blazingseollc.com/proxy
    edited December 2014
    @mrlinks - You must subscribe to the mailing list for Scraper Bandit as that is the only way you will receive updates for the system. We divide our mailing lists up because:

    1. I don't send out promotional junk... and
    2. The things I do send out only relate to my services. So being subscribed only to the lists for the services you use means you'll only get information you really care about :)

    @adystanley and @ServicesReseller - Thanks so much for the reviews guys, really appreciate it!
  • I want to say this is a great new service. I started using it yesterday; the results depend on the footprints and the value of the keywords you use! First I used only 10-15 keywords for an extreme niche product, searched for contextual article and wiki links, and got 30K results in no time!

    Today I tried a search with 2K keywords and only article footprints and received 1.6 million sites, plus a second list of 1.6 million links from a second search with the same 2K keywords on contextual links only. This is amazing.

    Best of all, this cost under $1 - more than 15 million links remain from my link credit purchase of only $6.46.

    This is a must-have service for GSA users who want to update their GSA SER list daily with a bunch of fresh links.
  • davbel UK
    edited December 2014
    I've used Scrapebox historically to build my own lists for SER and I've got some fairly well "honed" footprints and techniques that I've been using to run SB pretty much 24-7.

    When I started on the beta test I ran some test scrapes using the exact same footprints and keywords on both Scrapebox and Scraperbandit and Scraperbandit blew Scrapebox out of the water in terms of speed.

    The end results were more or less the same, but whereas Scrapebox took more than 13 hours to complete, Scraperbandit was done in about 1 hour...

    As others have already said, it really couldn't be any easier to use.  Grab some decent footprints, paste them in, paste in your keywords and decide how you'd like the results to be delivered then click start.

    Literally that easy to get lists of potential SER sites.

    +1
  • For the longest time I had been trying to get a huge wiki list for GSA SER. Then I saw the ad for ScraperBandit, so I tried them out, and I got over 94,000 edu wiki links, PR 3 - PR 8: an entire site list of wikis. The best thing is it's set-it-and-forget-it - when it's done they send you an email and you just download your links.

    You get so much more for less. Why pay almost $40-$50 for a site list? I paid only $7.50 for 25,000,000 links to be scraped. Big difference, huh?

    I love that it's totally web based - no more bans from Scrapebox, burned proxies, or taking forever to scrape 1,000 links. If you haven't signed up, shoot over there now and scrape a killer site list for GSA SER.

  • A web-based Google scraper is a different concept for me, so I gave it a shot. I signed up and received 100k links for free. I created campaigns and everything went smoothly.

    There is nothing advanced you need to know to use the BanditIM scraper; I agree with the statement that even a newbie can start easily.

    My campaign started within minutes, and a Download button was displayed with 3 different options.

    I had a couple of questions which I asked the OP, and he was quick to answer and solve them.

    Looking forward to using this scraper in the future. Thanks!!!
  • Exactly what I was looking for and without a shadow of a doubt one of the most worthy SEO services out there  :)
  • If I sold lists I'd be worried, very worried! Here are a few reasons why:
    1. The Bandit scraper is much cheaper than any list on this forum.
    2. The resulting URLs are keyword targeted, which Google is much more likely to love.
    3. Because the Bandit-scraped lists are tailor-made, they are not going to be spammed to hell, so they stand more chance of sticking.
    I have been learning how to scrape my own lists recently. Now I don't need to. And I don't need to fork out all the expense of scraping either :)
    This is a game changer.
  • BlazingSEO http://blazingseollc.com/proxy
    @puntoGT @davbel @Thebeast @bangkoklad @Don

    Thank you all so much for the kind words - it's what keeps me going and making more top quality products :)

    If you have any questions at all you know how to reach me!
  • To Whom.........

    The very first thing that really impressed me was the helpful nature and above all else their attention to details and their willingness to sort out the small bugs in what seemed like only a few minutes.

    Great Start
  • When you scrape, is it really a fresh scrape from the SEs, or are you building/refreshing a huge database that your clients scrape from?
  • BlazingSEO http://blazingseollc.com/proxy
    @cardman - Thanks so much for the kind words, we try our best to satisfy our customers before all else.

    @PaulieP - Right now 90% (if not more - we don't have 100% accurate stats yet) of the search terms are fresh scrapes. Once we've been live long enough, the results will be returned from a massive database we are compiling, which saves us some proxy usage. Why waste proxy usage when we scraped for the same search term just 2 hours ago? :) That's the flaw with Scrapebox/GScraper: everyone is wasting money on proxies, and EVERYONE is scraping the same darn things. We haven't yet determined the period at which we will "refresh" each search term, but I imagine it will be in the 4-14 day range. (Google doesn't update THAT often, so it shouldn't be too much of a problem; plus, old results work just as well as new results when we're only talking a few days' difference in refresh time.)
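    The caching approach described above - serve a recently scraped search term from a database instead of re-scraping it - can be sketched as a simple TTL cache keyed by search term. This is a hypothetical illustration only: the class name `SerpCache`, the `scrape_fn` hook, and the 7-day refresh window (picked from the 4-14 day range mentioned) are assumptions, not BanditIM's actual implementation.

    ```python
    import time

    # Example TTL (time-to-live): refresh a search term after 7 days.
    REFRESH_SECONDS = 7 * 24 * 3600

    class SerpCache:
        """Hypothetical sketch: cache scraped SERP results per search term."""

        def __init__(self, scrape_fn, ttl=REFRESH_SECONDS, clock=time.time):
            self.scrape_fn = scrape_fn  # function that actually hits Google via proxies
            self.ttl = ttl
            self.clock = clock          # injectable clock, handy for testing
            self._store = {}            # term -> (timestamp, results)

        def get(self, term):
            now = self.clock()
            entry = self._store.get(term)
            if entry is not None and now - entry[0] < self.ttl:
                return entry[1]         # still fresh: no proxy usage at all
            results = self.scrape_fn(term)
            self._store[term] = (now, results)
            return results
    ```

    The design choice mirrors the post: with many customers scraping the same popular footprints, a shared cache means each term costs proxy bandwidth at most once per refresh window.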
  • varthdaver Sydney, Australia
    Now this is cool.  It comes out to about the same cost as a VPS + proxies, but they have so many proxies you'll never burn them out with harder "inurl:" scrapes.  What would take you a month of scraping yourself is done by these guys in just over a day.

    Plus the quality support we've come to expect from BanditIM?  It's a no brainer, fellas!
  • Honestly speaking I was never excited about scraping my own targets. Seemed like too much trouble. 
    This tool, however, makes it easy. It really does. Did a small run yesterday. It was easy to set up and the results were good. It's good to see a user-friendly tool in SEO/IM business.
  • BlazingSEO http://blazingseollc.com/proxy
    @varthdaver @vladinla

    Thanks so much for the kind words guys, really appreciate it! I try my best to keep my customers happy, so I'm glad your reviews reflect that :)
  • It would be a good option in Scraper Bandit to be able to set the local Google servers that best suit our scrape campaigns, like in SB. I think that option would be good for all users?

    I don't believe Scraper Bandit recognizes the language of the input keywords and switches to that language's local Google by itself LOL :D
  • That is what I thought (I would have done the same thing :D). As long as you keep the database fresh enough, especially for the bigger niches, there shouldn't be a problem. It's like buying a list, only with raw links this time. I am going to give it a try.

    Thanks
  • Trying to get it. Your website is down?
  • That is exactly the kind of service I was looking for :D - it scraped the links very fast even though I gave it lots of keywords and footprints. It's very simple to create projects and maintain them, and when scraping completes it sends an automatic notification via email :). This is really cool: I scraped over 300,000 links, got a reasonable number of unique domains, and around 56% of the links were identified by GSA Search Engine Ranker. :D

    I really appreciate BanditIM's effort and recommend his service to all (y)
  • Any update regarding the maintenance? It's been more than 6 hours.
  • BlazingSEO http://blazingseollc.com/proxy
    edited December 2014
    @kajzer - Currently we only use google.com, but what you need to realize is that when you use a proxy (let's say from Europe) and go to google.com, Google routes your search to the datacenter nearest to you, meaning local results. We will add in functionality for the other domains, as they will be even more accurate, but for now it's not a top priority unfortunately.

    @PaulieP - Thanks man, hope you like it!

    @jonathanjon @adystanley - Found a rather major bug VERY late last night. My dev stayed up until 7 am trying to fix it and couldn't quite get it before his eyes shut on him :P. He will be awake in an hour or so and should finish it very shortly. I'll be sending out an email to let everyone know. Sorry for the inconvenience - this is why we're in "Early Bird Pricing", for situations just like these.

    @tayeebbd - Thanks so much, glad to see such a high success with GSA identifying!!


    Edit: We'll be back live in the next 30 minutes. Bug found and destroyed :)
  • I'm in the process of evaluating ScraperBandit. Though I'm currently using another service to get a list of sites to use with GSA, I like the idea of automagically supplying GSA with sites to post to.

    There is another service that seems similar in functionality, but I found them to be a little over my budget. I like the idea they and ScraperBandit have, and I felt ScraperBandit made it a bit more affordable for me.

    Right now, I have 9 sites and I'm using different tier models to promote them. Hence, having a large list of sites to post to is extremely important to me.

    ScraperBandit seems to be focusing on quality service and a fair price, so I'm giving them a try as we speak. The process is very simple:

    • You log into their service site
    • Select "Create" from the left panel
    • Give your project a name
    • Add your Keywords
    • Add your Footprints
    • Click "Start Campaign"
    All relatively easy to use.

    For my trial run, I used the footprints provided by GSA.

    I don't have any results to share as of yet, because I just started last night and unfortunately they reported a major bug which took them all night to fix. However, I took a look at my status, and so far the service has completed 815 sites. I'm working in a very small niche and suspect I'm not going to get hundreds of thousands of sites. But so far it sure beats using other tools.
  • Hello

    Any API ?
    Need such a service, but want to pilot it through apps.
  • BlazingSEO http://blazingseollc.com/proxy
    edited December 2014
    @joel - It's definitely possible, but with the huge to-do list and the bugs we are ironing out each day, you would need to have a rather large request for us to make it a top priority right now :(. In fact, it IS on our to-do list anyway (I could see SERP trackers and other big GSA list builders using an API a lot)... but it's one of many on the list, haha.
  • Feature requests:

    Low priority -> once a campaign is completed, the ability to review campaign details (I cannot see the list of keywords & footprints used).
    Medium priority -> in deliveries, add a "keyword & footprint" column, so that for each result I can tell which keyword & footprint found it. Then I can build stats and optimize my footprint use.
    Regards


  • edited December 2014
    Anyway, it seems like your service is going to reshape the scraping market. Nice job.
  • edited December 2014
  • I have tried it, and damn, I am doing away with GScraper when it comes to building links. I love the fact that you get 3 files: unique domains, unique URLs, and the full list.

    I got the free trial of 100K, I believe. I used some footprints for WordPress article sites and scraped a whopping 62,234 unique domains.

    I don't know how they will do in SER, but I guess it will be fine; I will be using this service a lot.

    Great job, and if you guys are still on the fence, just give it a free test - I'll bet you will be surprised by the power and convenience.

    My name is Paul and I approve this service :D .