ScraperBandit - Beta Testing Open To Public

*Edit* - ScraperBandit Service is now fully up and running! Check out the official sales thread for more details: https://forum.gsa-online.de/discussion/15247/scraperbandit-the-only-scraping-system-you-ll-ever-need?new=1
https://forum.gsa-online.de/discussion/10810/new-google-scraper-about-to-come/p1
Note: The Footprint system is not quite ready. We wanted to get the scraping part out as fast as possible and want to get that perfected first.
In a nutshell, this system will replace your current Google scraping servers at a much lower cost. People argue that they only pay a one-time fee for GScraper/Scrapebox, but what they don't factor into their expenses are the servers and proxies needed to run that software. Our system handles all the server usage, all the proxies, and most importantly the link handling. Also, would you rather scrape 50 million total links in a month with your current setup, or 50 million links in a single day using our system? Your answer depends on how much you want to scale up your link building in GSA SER.
In our current beta version we allow FTP and Dropbox upload, which gives you the opportunity to constantly append to a text file in your Dropbox folder so that GSA SER can continually read that updated site list. How powerful is that?! Forget having GSA scrape for you, forget constantly transferring files from ScrapeBox/GScraper to your sitelist file --- our system will handle it for you automatically!
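For anyone wiring this into their own setup, the append-to-sitelist idea can be sketched in a few lines. This is only an illustration under my own assumptions (the file path and the dedup step are mine) - the service's actual delivery code isn't public:

```python
# Hypothetical sketch: append freshly scraped URLs to a Dropbox-synced
# sitelist file so GSA SER keeps re-reading the growing list.
# The path handling and dedup step are assumptions for illustration.
import os

def append_to_sitelist(new_urls, sitelist_path):
    """Append only URLs not already present; return how many were added."""
    existing = set()
    if os.path.exists(sitelist_path):
        with open(sitelist_path, "r", encoding="utf-8") as f:
            existing = {line.strip() for line in f if line.strip()}
    added = [u for u in new_urls if u and u not in existing]
    with open(sitelist_path, "a", encoding="utf-8") as f:
        for url in added:
            f.write(url + "\n")
    return len(added)
```

Appending rather than rewriting matters here, because GSA SER can keep reading the same file while it grows.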
The starting price (tentatively) will be $0.0000005 per link, which is $5 for 10 million scraped links. This price MAY change as we see our LPM increase/decrease according to the footprints given by people. However, for beta testing, you will be given 24 million links FREE each day until the end of beta testing!
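To sanity-check the quoted numbers, the per-link price works out exactly as stated:

```python
# Quick arithmetic check of the quoted beta pricing.
PRICE_PER_LINK = 0.0000005  # $0.0000005 per scraped link

def cost_usd(links):
    """Cost in dollars for a given number of scraped links."""
    return links * PRICE_PER_LINK

# 10,000,000 links at $0.0000005 each comes to $5, matching the post.
```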
To be a beta tester, you must give an honest review of the system once you've given it a test. I don't care if it's bad or good, as long as you're honest (and not a competitor of course). If you do not give a review after testing, we will unfortunately have to ban you from the system for further usage. Taking 2 minutes out of your day to post a review is not much to ask for 24 million links FREE each day, so please help us out!
Simply post below if you are interested and I will PM you with details on how to sign up for your free account.
Thank you so much for the continued support of BanditIM everyone, I really appreciate it!
- Neil Emeigh
BanditIM Admin
Comments
This service literally takes away your scraping work: it cuts the costs of running a scraper, plus proxies, a VPS, and resources in general.
It's a set-and-forget / plug-n-play system. It'll continuously scrape your targets; the only thing you've got to make sure of is that you have the resources to process these lists.
Nobody should have to purchase lists anymore with something like this. It takes the tedious task of scraping and cuts costs dramatically. I mean, $5 for 10,000,000 URLs, when some list sellers charge around $50 each for up to 100,000 links... at this pricing, that same $50 could get you 100,000,000 potential targets.
All you need is keywords + footprints.
Great work @BanditIM for creating such an innovative piece of online software, all the best!
My run was supposed to get just under the daily 24 mil, here are the results:
40,603 TOTAL urls scraped
27,530 unique urls
7,077 unique domains
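The three counts above follow a simple reduction (raw URLs, then unique URLs, then unique domains). A rough sketch of how such a summary could be computed, assuming the URL's hostname counts as the "domain":

```python
# Sketch of the total / unique-URL / unique-domain summary shown above.
# Treating the URL's hostname (netloc) as the "domain" is an assumption.
from urllib.parse import urlparse

def summarize(urls):
    """Return (total, unique URLs, unique domains) for a scraped list."""
    unique_urls = set(urls)
    unique_domains = {urlparse(u).netloc.lower() for u in unique_urls}
    return len(urls), len(unique_urls), len(unique_domains)
```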
Now, I can still do more scraping, as I didn't actually use the 24 mil links, but that means getting back in there and starting new campaigns until I use it up.
It would be easier if it let you set up a monstrous campaign that could last weeks, and have it processed in batches to stay within the daily limit.
Also, it appears the footprints did not work at all, so I doubt I will be able to build many, if any, links with these URLs. There were a ton from Amazon and Craigslist, for example.
Once the footprint system is working, with a few tweaks - this thing is going to be a monster!
Have a nice day, and we wish you all the best!!!
@Tim89 - Thanks so much for the fast and thorough review pal, glad you liked it. Still waiting on a PM back regarding Yahoo/Bing interest.
@magically - Wow wow, thanks so much for the thorough review for everybody! We are looking into the copy/paste bug right now, as another user seems to be having similar problems with it. For now, everyone should just Upload the files - that seems to be working fine.
Regarding results - I have to remind everyone that it all comes down to the footprints and keywords you give it.
@almostbasic - I completely agree about the monster campaigns. We have it on our to-do list; it just wasn't top priority, so that we could concentrate on getting the base system out first. Regarding the 24 million links, let me clarify what's happening:
We charge people 1000 link credits for each search term they submit. You're right, each search term will not get the full results, so when your order is sent back to us, what we actually do is credit BACK the number of links that were not scraped. So, for example, if your search term only came back with 500 links, your balance would be +500 links at the end of the campaign.
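My reading of that credit model, sketched as code (the 1000-credit reservation and the refund rule are from the post; the function itself is hypothetical):

```python
# Hypothetical model of the billing described above: 1000 credits are
# reserved per search term, and any shortfall is refunded afterwards.
CREDITS_PER_TERM = 1000

def settle_campaign(links_returned_per_term):
    """Return (credits_reserved, credits_refunded) for a finished campaign."""
    reserved = CREDITS_PER_TERM * len(links_returned_per_term)
    refunded = sum(max(0, CREDITS_PER_TERM - n)
                   for n in links_returned_per_term)
    return reserved, refunded
```

So a campaign with one term returning 500 links and another returning the full 1000 would reserve 2000 credits and refund 500.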
Regarding footprints - our system scrapes them no differently than GScraper/Scrapebox, it all comes down to what footprints and what keywords you feed it. If they aren't of quality, then you're going to get bad results unfortunately. Thanks so much for the review though pal, appreciate it!
I want to remind everyone this is termed "BETA" for a reason - we expect bugs.
- Neil
One more review here - I'd really like to give you an honest review of the service.
@BanditIM sorry about not replying yet, been a busy Monday, will look through the PM and get back to you.
@magically -- Please refer to the beta thread in the original post. We have plans to create a 24/7 footprint scraping system, which is not quite implemented yet. That's completely unrelated to uploading your own footprint lists.
@s4nt0s - Thanks so much Mr. Devin for the awesome review and screenshot! We have both your suggestions already on our to-do list, just a matter of time before getting to them.
Thanks everybody!
Even though I don't think people will enter only a few keywords and footprints, there seems to be a bug: when I entered 2 keywords and 2 footprints, the software said the campaign would use 4,000 links.
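For what it's worth, the 4,000 figure is consistent with the per-term pricing described earlier if (and this is my inference from the numbers, not something the service has confirmed) every footprint is crossed with every keyword and each resulting search term reserves 1,000 credits:

```python
# Assumed cost estimate: footprints x keywords = search terms, each
# reserving 1000 credits. Inferred from the reported numbers; unconfirmed.
from itertools import product

def estimated_credits(footprints, keywords, per_term=1000):
    """Estimate credits a campaign would reserve up front."""
    terms = [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]
    return len(terms) * per_term
```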
On your post it states "However, for beta testing, you will be given 24 million links FREE each day until the end of beta testing!"
When I log in it says I only get 2.4 million links. Can you please look into this.
"You have 2400000 Request Remaining. Buy More?"
@BanditIM
@Crownvic - We're fixing it
@dariobl - Thanks SO much for the long and thorough review pal! We found out the email registration was due to Aweber blocking your email address (spammer huh? :P). As for the 0 results, haven't heard of that yet, definitely will be looking into your campaigns.
I've written down your suggestions! What do you mean in regards to: "Enable filtering the links for custom terms, the same feature like gscraper have."
@fng - If they don't return the full 1000 links back, we give you the credit back on the ones it didn't return
I used my company email address for my first registration attempt, so it's definitely not an issue with my email address.
Regarding filtering feature, I meant to allow us to export only the links with specific words in the URL for example to export only the URLs with .co.uk in the name.
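In the meantime, that kind of filter is easy to run locally on an exported list. A minimal version (the exact matching rule is my assumption, modeled loosely on GScraper's keyword filter):

```python
# Keep only URLs containing any of the given terms, e.g. ".co.uk".
# Simple substring matching; a real filter might match on hostname only.
def filter_links(urls, terms):
    return [u for u in urls if any(t in u for t in terms)]
```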
Hi
Tried the beta version for the first time today. Here are my impressions:
- a nice clean interface
- very intuitive and easy to use
- I loaded my campaign and within probably about twenty minutes I received a notification email for my results
- I did not experience any technical issues and the scraper delivered the URLs as promised.
So far I am impressed and happy with the Scraper!
Thanks!
Otherwise it looks nice!
PM me
Scraping is fast, cheap, and beginner-friendly.
So, I am pretty sure this service will be perfect for users with limited resources because:
- no need for scraping software (GScraper, Scrapebox, Hrefer, ...)
- no need for VPS
- no need for proxies
- cheap
- fast
- easy to use
Scraped URLs - results can be sent to your e-mail automatically after a task is completed, or uploaded to FTP or Dropbox.
I also like the feature of sorting the results into "full list", "unique domains", and "unique urls".
I have only one question: what does "Enclose Footprints:" mean?