  • When can we expect to see the scraper running live again? :)
  • BlazingSEO http://blazingseollc.com/proxy
    It's back up, just getting everything sorted out to make sure it's perfect again. You can start posting campaigns again though @dariobl
  • I just tested out this service and I have to say it is amazing.  You will save a lot of money by using this service as you will not have to pay for the following:

    -dedicated server $60+
    -proxies $40+
    -software $70

    All you have to do is upload your footprints and keywords and it will automatically scrape for you. It finished a 13 million URL scrape in a few hours. No need to worry about anything.

  • I am interested in beta testing.
  • BlazingSEO http://blazingseollc.com/proxy
    @SuperSEO - Thank you very much for the review, appreciated :)

    All PMs for beta testers above this post have been sent - enjoy everyone and please PM me any bugs you find!
  • I am interested in this service and will leave a review.  Please PM me the details.  Thanks
  • I am late to the party, but if review copies are still available I'll take it.
  • I would like to test this scraper.
    Thanks in advance
  • I'd love to try out ScraperBandit! Sounds intriguing.

    Cheers
  • WOW

    I was one of the first beta testers - what was that like 2 days ago?   I swear it seems to be working better and better with every scrape. 

    CRITICAL:  We REALLY need a better way to set up campaigns.  With that credit system, I'm back in there setting up a campaign every hour or so.  The ideal setup:
    I load in a monstrous list of footprints and keywords, and the system will run the campaign until the credits are used,  pause it until the next day, then pick up where it left off.

    That's the only real issue I see with this.  Other than that - this is INCREDIBLE


  • Whoops.

    The last scrape was not delivered to Dropbox.

    I swear I ticked the option.
  • @almostbasic

    The way you recommend would cause a huge server overload. I'd rather stick with the current system and have a stable platform than wait tens of hours to get a single campaign finalized :)

    Anyway, I just want to say once again that this service is a pure goldmine. I got more contextual links in the last 24 hours than in the last 2 months of purchasing various lists from various providers, wasting my time and money.

    I just hope that the price will be affordable for this service too :)

    I would pay $100 or a little more per month for what we have right now, 24M links per day, even if it were fewer links, as I'm definitely not exporting that much because SER can't take it all :) Even 10M-15M would definitely do the job.
  • I am interested. I will reply with an honest review.
  • @banditim, sent you a PM, please check.
  • Interested too in testing this baby, and I will write an honest review.
  • Count me in, if there are any spots left open :)
  • @dariobl - see I think it would do quite the opposite.  By having scheduled campaigns, they will know how much demand there is, and can schedule things accordingly.  Right now - if a bunch of people go in at once and launch massive campaigns, then it's going to get hammered all at once.

    I think it would be easy to set it up with an option to split up the results by XXX # of links and email / ftp / dropbox the results when it reaches that #.

    So, you set up a campaign that could last weeks, and set it to deliver results every 1,000,000 links via email, dropbox, etc.  Then, roughly once an hour you'll get a fresh list of links delivered.

    This system is incredible - however since I've been using it, I've had it open on my computer all day long, and am regularly in there starting a new campaign, so it's using up more time than I think it needs to.

    If this was more of a set it and forget it solution - I think it would be a game changer.  Don't get me wrong - I'm already on board, I love this system.  But I think it could be a lot better if it were more automated.
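    A rough sketch of the batching I have in mind (the names here are made up for illustration, not ScraperBandit's actual API): buffer scraped links and flush a chunk to a delivery callback every time the batch size is hit.

```javascript
// Hypothetical sketch of batched delivery: buffer scraped links and hand
// off every `batchSize` links to a delivery callback (email/FTP/Dropbox).
// `makeBatcher` and `deliver` are illustrative names, not a real API.
function makeBatcher(batchSize, deliver) {
  let buffer = [];
  return {
    push(link) {
      buffer.push(link);
      if (buffer.length >= batchSize) {
        deliver(buffer);       // e.g. upload this chunk to Dropbox
        buffer = [];
      }
    },
    flush() {                  // deliver any remainder at campaign end
      if (buffer.length > 0) {
        deliver(buffer);
        buffer = [];
      }
    },
  };
}

// With batchSize set to 1,000,000, a long-running campaign would emit a
// fresh list roughly once an hour instead of one giant file at the end.
const delivered = [];
const batcher = makeBatcher(3, (chunk) => delivered.push(chunk.length));
["a", "b", "c", "d", "e"].forEach((l) => batcher.push(l));
batcher.flush();
// delivered is now [3, 2]
```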
  • Hi @BanditIM
    I found this thread really late. Do you need one more beta tester?

    If so, please count me in…

    I have a lot of experience with ScrapeBox and GScraper.
    Thanks Marc
  • BlazingSEO http://blazingseollc.com/proxy
    All PMs sent for beta access.

    @Crownvic - Looking into it, thanks for the report

    @almostbasic and @dariobl - Please see our update post :)  http://banditim.com/blog/scraperbandit-update-1/  Should be ready by tonight! Dariobl, you will be able to use the system as it is now by simply creating small campaigns. For users like almostbasic, he will have the option now to submit large campaigns and not have to worry about constantly loading in.

    @backlinkaddict - Thanks so much for the great review pal!


  • I'm interested to test this beast :)
  • After 2 days of testing it, I have to say it is really great. I think it would be better to add "scheduled campaigns", so we could supply many keywords and footprints and just set it to deliver some results every day. That way we would not have to log in and do the same work over and over again.
    Another suggestion: some footprints available in the panel, ready to use. The user would then only have to check which platforms he wants to look for, set keywords, and click "Start". Of course, the Scraper community would be able to edit the footprints for each platform and share them with other users.

  • +1 For being able to schedule campaigns

    I would also like to see one feature added: could you please let us see the footprints and keywords that were used in a completed scrape, so we don't use them again?

    Also, I am very interested in the pricing for this after beta testing is completed.  I hope it will be reasonable and affordable.

    @BanditIM

    Love the service you are providing and keep up the good work.
  • BlazingSEO http://blazingseollc.com/proxy
    @adam and @SuperSEO - See our blog post about scheduled campaigns - http://banditim.com/blog. We are very close to having this implemented :).

    As for seeing the keywords/footprints used, we already have this feature, but we took it off for now for simplicity's sake. It'll be added back soon enough.

    Pricing - Until we see the demand and the search terms users give us (hard or easy footprints, for example), I have set the price at $5 for 10 million links.
  • Would love a test drive of the system. I'm in need of a system like this; lists and having SER scrape are not cutting it.
  • Just tested it, and so far it seems to be a user-friendly and time-saving service.
  • edited November 2014

    Just finished my testing. Scraping is amazing, and results are deduped in two segments: deduped by URL and deduped by domain. So overall a great service.

    DropBox

    Integrating with cloud storage is a nice idea. I had my Dropbox instance running locally and on my GSA server. After some work I came back to my desktop and found the job status was completed. I checked my Dropbox in both instances, and the archive with the URLs was already there. Overall a nice experience.

    Scraping

    I have yet to evaluate how closely this scraper's results match the footprints compared to ScrapeBox. Overall the numbers are great. I tried 110 footprints and 200 keywords, a combination of 22k search terms. The results I got from the BIM Google scraper were as follows:

    • Non De-Duped = 2.2 Million
    • De-duped based on URL = 710K
    • De-duped based on Domains = 270K

    On average, each search term returned about 100 results (2.2M results from 22k search terms); that's great too.
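    For anyone curious, the two dedup passes described above could look something like this. An illustrative sketch, not BIM's actual code; the sample URLs are made up, and the domain pass crudely treats each hostname as a domain.

```javascript
// Sketch of the two dedup segments: dedupe by full URL, then by hostname.
function dedupeByUrl(urls) {
  return [...new Set(urls)];            // Set keeps first occurrence of each URL
}

function dedupeByDomain(urls) {
  const seen = new Set();
  const out = [];
  for (const u of urls) {
    const host = new URL(u).hostname;   // crude: each hostname counts as a domain
    if (!seen.has(host)) {
      seen.add(host);
      out.push(u);
    }
  }
  return out;
}

const raw = [
  "http://example.com/a",
  "http://example.com/a",               // duplicate URL
  "http://example.com/b",               // same domain, different URL
  "http://other.org/x",
];
const byUrl = dedupeByUrl(raw);         // 3 unique URLs
const byDomain = dedupeByDomain(byUrl); // 2 unique domains
```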

    GSA SER Submissions Of Scraped Targets

    I will run the list in GSA SER and update this thread once done. I am evaluating this because the footprints are taken from GSA SER verbatim.

    Enhancements If Possible

    1) Queue Updates While Job Running

    The status stayed at 'Keywords Queued' for more than an hour without being processed, presumably because of the queue. Displaying the number of jobs queued ahead of mine would be more informative than the current situation.

    2) Keep The Inputs Alive In Create Campaign

    On the Create Campaign screen, after entering footprints and keywords, there is an option to enable the Dropbox feature. When I tried it, whatever I had typed on the screen was gone. Retaining what I typed when I come back would be a great help.

    So here is a possible solution (for your developers to look into):

    1. When the Dropbox feature button is clicked, do a server-side postback and keep the typed values in the session.
    2. Open the window as a JavaScript popup, and once the work is done, refresh the Create Campaign page.
    3. On reload, put back the data kept in the session in step 1.

    You could get even better solutions with jQuery, without a server-side postback, if you would like me to suggest them.
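    A minimal client-side sketch of the idea, with made-up field names; in a browser, `store` would be sessionStorage, and the save would run before the Dropbox popup opens.

```javascript
// Hypothetical sketch: snapshot the Create Campaign fields before opening
// the Dropbox popup, and restore them when the page reloads.
// "campaignDraft" and the field names are illustrative assumptions.
function saveFields(store, fields) {
  store["campaignDraft"] = JSON.stringify(fields);
}

function restoreFields(store) {
  const raw = store["campaignDraft"];
  return raw ? JSON.parse(raw) : null;  // null if nothing was saved
}

// Before opening the Dropbox popup:
const store = {};                       // stand-in for window.sessionStorage
saveFields(store, {
  footprints: '"powered by wordpress"',
  keywords: "blue widgets\nred widgets",
});

// After the page reloads, put the draft back into the textareas:
const draft = restoreFields(store);
```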

    Overall, this service is going to be famous among GSA users for sure. I will keep testing and let you know my views on this system, as scraping is my first priority for getting better-quality backlinks from GSA.

  • I am an addicted user of ScrapeBox; if there is anything in this world that can scrape more than SB, I would love to have it.

    I am very interested in the beta test, to see if it can outperform SB, and then I will surely give my review here without partiality.
  • Hi, can I test it?
  • The problem is that GSA SER takes a long time just to ingest the 700k targets. It has already been more than two hours and only 24k have been taken, so I assume it is going to take a while before I can check the GSA SER results.
  • The last campaign has had a "Keywords Queued!" status for more than 6 hours now.

    High load on servers? Or a bug?
  • I can't create any campaigns :(
  • I'll test it and leave you an honest review.
  • BlazingSEO http://blazingseollc.com/proxy
    Please refer to banditim.com/blog for our recent update.

    @Seljo - Why can't you?
  • can't create campaigns either
  • Are Beta testing slots available? If so I'd be glad to test and leave a review.
  • BlazingSEO http://blazingseollc.com/proxy
    edited November 2014
    @convert2seo - Try uploading files instead of using the type-in fields. I believe that's the bug everyone is encountering, and some users are getting campaigns in just fine by uploading now.

    @sarav - We have plans to include a standalone identify/sort for GSA lists that will automatically be run after results come back... this will save you LOTS of time :)
  • I'd like to beta test. Let me know if that's still possible? Thanks.
  • Yes uploading files got me scraping on steroids :)


  • Hi, I would like to test your ScraperBandit; it looks very promising. The price is also cheap. Can't wait :)
  • BlazingSEO http://blazingseollc.com/proxy
    All PMs sent.

    We have fixed all bugs from today (despite it being Thanksgiving holiday). Scraper is running at full speed and you are able to use the text fields instead of uploading again. 

    Due to the bugs, we will be pushing out a more advanced scheduler hopefully by tomorrow night. The current scheduler simply takes away all credits until you replenish credits (better than the current system that only allows you to upload as many search terms as your credits allow). The advanced scheduler we are implementing will allow you to Play/Pause campaigns, as well as schedule "Create X links per day" for each campaign. It'll give you MUCH more control over your campaigns :)
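    A hypothetical sketch of the "create X links per day" scheduling described above: a campaign scrapes until it hits its daily allowance, then pauses until the next day. All field and function names here are assumptions for illustration, not the actual implementation.

```javascript
// Sketch of a daily-allowance scheduler: run until `linksPerDay` is spent,
// pause, then reset the counter when a new day starts.
function shouldScrape(campaign, today) {
  if (campaign.paused) return false;     // user hit Pause
  if (campaign.lastDay !== today) {      // new day: reset the daily counter
    campaign.lastDay = today;
    campaign.linksToday = 0;
  }
  return campaign.linksToday < campaign.linksPerDay;
}

function recordLinks(campaign, count) {
  campaign.linksToday += count;
}

const campaign = {
  paused: false,
  linksPerDay: 1000,
  linksToday: 0,
  lastDay: "2014-11-27",
};
recordLinks(campaign, 1000);
const runMore = shouldScrape(campaign, "2014-11-27"); // false: allowance spent
const nextDay = shouldScrape(campaign, "2014-11-28"); // true: counter reset
```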
  • KevinRB Chiang Mai, Thailand
    Hi Neil
    Any indication yet as to the real launch date?
    Regards
    Kevin
  • edited November 2014
    i am interested in beta testing
  • @BanditIM, now copy-paste works, but files stopped working.

    There must still be a bug.
  • REVIEW time

    Okay, this could be a game changer for me. Scraping 24M links (ignoring dups etc.) takes me around 2-3 hours of public proxy scraping and then a day of scraping, then 5-10 minutes of deleting dups. With this service I can either cancel my scraper VPS or use it for other purposes. A huge money + time + resource saver.
  • @BanditIM

    An option to exclude URLs with no domain (IP address URLs) would be great.

  • Interested in beta testing :)
  • BlazingSEO http://blazingseollc.com/proxy
    @KevinRB - Once we push out our big update on scheduled campaigns, we will open up credit purchases. During the beta phase we will still be giving discounted links, so it may be in most people's interest to stockpile these discounted links while they can :)

    @Seljo - Darn, we fix one bug and it causes another... I'll have the dev look into it.

    @sarav - GSA can post to IP address URLs though. I'll write it down though on our todo list.
  • edited November 2014
    I'm interested in testing it    :-h
  • I haven't yet posted an official review, but I have to say... you can scrape usable URLs in about 1 hour, versus the 24 hours and probably a lot more it would take me. This is just flabbergasting.
  • edited November 2014
    Wow, I just got a hands-on feel for this tool. This is something out of this world; you don't need to spend money on a server and proxies for scraping lists.

    I just played with it on a test drive and got about 22,000 unique domains and about 58,000 URLs. This is great, far better than ScrapeBox and GScraper.
  • I just used it this morning and it is a BOMB. Just as the developer said, you don't need a server or proxies to run this monster (tool). It is very easy to understand, and it does what it says.

    I have this to say: THE END OF SCRAPEBOX, GSCRAPER, AND EVERY OTHER SCRAPER ON THE MARKET HAS COME.

    But one recommendation I have is that it should be a monthly fee, not credits, because anyone can easily finish up the credits in one campaign.

    Also, the footprints should be preinstalled in the tool. Since it is meant for scraping GSA footprints, they should be stored there, such that users can edit, add, remove, or modify them to their taste.
  • [My English is very weak, so sorry about this]

    I just used this scraper last night, so here are my results.
    I used around 24,000 keywords and 1 footprint for the demo.
    There are still 16,376,324 links remaining on this demo package.
    Here is what I got:
    Time Start                  --  Time End
    2014-11-28 09:03:37  --  2014-11-28 11:51:15

    Links I got:
    7,623,676 links found
    4,817,028 unique links found
    1,183,102 unique domains found

    Complaints:
    First of all, I did not get a verification email, but then I used "forgot password" and received a verification email and a random password.
    However, there is now no option to change the password.

  • As of now I'm getting so many good results that if this is priced right, I'm ditching gscraper. 

    So many good results (thanks to my footprints); with only 5 million links spent I have made a very nice verified list for GSA, without the crappy platforms :)

    Very very satisfied :)

  • This is just unbelievable. In 30-40 minutes I scraped a list that would take me a month with ScrapeBox. This is not normal. All I can say is thanks for letting me test your tool. It's a tool that a lot of people are willing to pay for. I'm 100% SURE.
  • Thoughts and review: I must say I like this service a lot. I haven't been scraping much for SER, as I wanted to use the inbuilt scraper, but I have scraped quite a lot with ScrapeBox and Ultimate Demon in the past. It's time-consuming and gets messy with all the processing and keeping track of all the files needed. This is clean and simple, and given the nature of SER's target-list capabilities, I simply run the scrape, take the output, feed it to the software, and I'm done. I'm going to use this service a lot in the future.
  • Hey BanditIM...I'm interested in beta testing this.  So far, reviews look outstanding!
  • Tim89 www.expressindexer.solutions
    It's an outstanding innovative web service, that works well.

    Should have been made a long time ago! Lazy bugger @BanditIM :)
  • I'm also interested in testing this service.
  • Did a test with Korean KWs. Scraper Bandit is a super app.

    One suggestion. Add a sieve filter like Hrefer.

    Sieve filter = Scraped links are checked against the sieve filters. Anything not matching the sieve filters will be discarded.
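    A sieve filter like the one described above could be sketched roughly like this; the patterns and URLs are made-up examples, and a real filter (as in Hrefer) would likely support masks or regexes rather than plain substrings.

```javascript
// Illustrative sketch of a sieve filter: keep only scraped links that match
// at least one sieve pattern, discard everything else.
function sieve(links, patterns) {
  return links.filter((link) => patterns.some((p) => link.includes(p)));
}

const links = [
  "http://blog.example.com/?p=12",
  "http://forum.example.org/showthread.php?t=9",
  "http://static.example.net/logo.png",
];
const patterns = ["showthread.php", "?p="];   // hypothetical sieve entries
const kept = sieve(links, patterns);          // the .png link is discarded
```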

  • I started the campaign a few hours ago and the status is still "Building Campaign...". @BanditIM, is everything working fine tonight?
This discussion has been closed.