I just tested out this service and I have to say it is amazing. You will save a lot of money by using this service as you will not have to pay for the following:
- dedicated server: $60+
- proxies: $40+
- software: $70
All you have to do is upload your footprints and keywords and it will automatically scrape for you. It finished a 13 million URL scrape in a few hours. No need to worry about anything.
I was one of the first beta testers - what was that, like 2 days ago? I swear it seems to be working better and better with every scrape.
CRITICAL: We REALLY need a better way to set up campaigns. With that credit system, I'm back in there setting up a campaign every hour or so. The ideal setup: I load in a monstrous list of footprints and keywords, and the system will run the campaign until the credits are used, pause it until the next day, then pick up where it left off.
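Just to make the idea concrete, here's a rough sketch of the loop I have in mind (TypeScript; names like `scrapeQuery` and `dailyCreditLimit` are made up for illustration, not the real system):

```typescript
// A rough sketch of the requested behaviour, not the real ScraperBandit internals:
// run until the day's credits are used, pause until they reset, then resume.
type Query = { footprint: string; keyword: string };

// Placeholder for whatever actually performs one search (assumed, not a real API call).
async function scrapeQuery(q: Query): Promise<void> {
  console.log(`scraping: ${q.footprint} "${q.keyword}"`);
}

// Milliseconds until midnight, i.e. until the daily credits reset in this sketch.
function msUntilTomorrow(): number {
  const now = new Date();
  const tomorrow = new Date(now.getFullYear(), now.getMonth(), now.getDate() + 1);
  return tomorrow.getTime() - now.getTime();
}

const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function runCampaign(queries: Query[], dailyCreditLimit: number): Promise<void> {
  let cursor = 0; // remembers where the campaign left off
  while (cursor < queries.length) {
    let creditsUsed = 0;
    while (creditsUsed < dailyCreditLimit && cursor < queries.length) {
      await scrapeQuery(queries[cursor]); // one credit per search term in this sketch
      creditsUsed++;
      cursor++;
    }
    if (cursor < queries.length) await sleep(msUntilTomorrow()); // pause until credits reset
  }
}
```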
That's the only real issue I see with this. Other than that - this is INCREDIBLE
The way you recommend would cause a huge server overload. I'd rather stick with the current system and have a stable platform than wait tens of hours to get a single campaign finalized.
Anyway, I just want to say once again that this service is a pure goldmine. I got more contextual links in the last 24 hours than in the last 2 months of purchasing various lists from various providers, wasting my time and money.
I just hope the price for this service will be affordable too.
I would pay $100 or a little more per month for what we have right now (24M links per day), even if it were fewer links, as I'm definitely not exporting that much because SER can't take it all. Even 10M-15M would definitely do the job.
@dariobl - see I think it would do quite the opposite. By having scheduled campaigns, they will know how much demand there is, and can schedule things accordingly. Right now - if a bunch of people go in at once and launch massive campaigns, then it's going to get hammered all at once.
I think it would be easy to set it up with an option to split up the results by XXX # of links and email / ftp / dropbox the results when it reaches that #.
So, you set up a campaign that could last weeks, and set it to deliver results every 1,000,000 links via email, dropbox, etc. Then, roughly once an hour you'll get a fresh list of links delivered.
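Something along these lines (again just a sketch with invented names, not how the backend actually works):

```typescript
// Sketch only: split scraped URLs into fixed-size batches and hand each batch off
// for delivery (email / FTP / Dropbox) as soon as it fills up.
function* chunked<T>(items: Iterable<T>, size: number): Generator<T[]> {
  let batch: T[] = [];
  for (const item of items) {
    batch.push(item);
    if (batch.length === size) {
      yield batch;
      batch = [];
    }
  }
  if (batch.length > 0) yield batch; // deliver the final partial batch too
}

// Tiny demo: "deliver" every 1,000 links of a fake 2,500-link scrape.
const demoUrls = Array.from({ length: 2500 }, (_, i) => `http://example.com/page/${i}`);
for (const batch of chunked(demoUrls, 1000)) {
  console.log(`deliver ${batch.length} links`); // stand-in for the email / FTP / Dropbox push
}
```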
This system is incredible - however since I've been using it, I've had it open on my computer all day long, and am regularly in there starting a new campaign, so it's using up more time than I think it needs to.
If this was more of a set it and forget it solution - I think it would be a game changer. Don't get me wrong - I'm already on board, I love this system. But I think it could be a lot better if it were more automated.
@Crownvic - Looking into it, thanks for the report
@almostbasic and @dariobl - Please see our update post http://banditim.com/blog/scraperbandit-update-1/ It should be ready by tonight! Dariobl, you will be able to use the system as it is now by simply creating small campaigns. Users like almostbasic will now have the option to submit large campaigns without having to constantly load them in.
After 2 days of testing it, I have to say it is really great. I think it would be better to add "scheduled campaigns": we could load in many keywords and footprints and just set it to deliver some results every day. That way we would not have to log in and do the same work over and over again. Another suggestion: have some footprints available in the panel, ready to use, so the user would only have to check which platforms he wants to look for, set keywords, and click "Start". Of course, the Scraper community would be able to edit the footprints for each platform and share them with other users.
I would like to see a feature added to the product: the ability to view the footprints and keywords that were used in a completed scrape, so we don't use those footprints and keywords again.
Also, I am very interested in the pricing for this after beta testing is completed. I hope it will be reasonable and affordable.
As for seeing the keywords/footprints used, we already have this feature, but we took it off for now for simplicity's sake. It'll be added back soon enough.
Pricing - Until we see the demand and the kinds of search terms users give us (hard or easy footprints, for example), I have set the price at $5 for 10 million links.
Just finished my testing. Scraping is amazing, and the results are deduped in two segments: by URL and by domain. So overall, a great service.
DropBox
Integrating with cloud storage is a nice idea. I had my Dropbox instance running locally and on the GSA server. After some work I came back to my desktop and found the status showing the job was completed. I checked my Dropbox in both instances, and the archive with the URLs was already there. Overall, a nice experience.
Scraping
I have yet to evaluate how closely this scraper's results match the footprints compared to Scrapebox, but overall the numbers are great. I tried 110 footprints and 200 keywords, which works out to 22K search terms in combination. The results I got from the BIM Google scraper were as follows:
- Non-deduped: 2.2 million
- Deduped by URL: 710K
- Deduped by domain: 270K
On average, each search term returned around 120 results, which is great too.
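For anyone wondering how those numbers fit together, here is a quick illustrative sketch of the footprint x keyword combination and the two dedupe passes (purely my own illustration in TypeScript, not the service's code):

```typescript
// Illustration only, not the service's code: how 110 footprints x 200 keywords
// become 22,000 search terms, and the difference between the two dedupe passes.
function buildSearchTerms(footprints: string[], keywords: string[]): string[] {
  const terms: string[] = [];
  for (const f of footprints) {
    for (const k of keywords) {
      terms.push(`${f} "${k}"`); // 110 x 200 = 22,000 combinations
    }
  }
  return terms;
}

// Dedupe by URL: two entries only collapse if the full URL string matches.
function dedupeByUrl(urls: string[]): string[] {
  return [...new Set(urls)];
}

// Dedupe by domain: keep only the first URL seen per hostname, which is why
// this list (270K) comes out much smaller than the URL-deduped one (710K).
function dedupeByDomain(urls: string[]): string[] {
  const firstPerHost = new Map<string, string>();
  for (const u of urls) {
    try {
      const host = new URL(u).hostname;
      if (!firstPerHost.has(host)) firstPerHost.set(host, u);
    } catch {
      // skip malformed URLs
    }
  }
  return [...firstPerHost.values()];
}
```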
GSA SER Submissions Of Scraped Targets
I will run the list in GSA SER and update this thread once done. I am evaluating this because the footprints were taken from GSA SER verbatim.
Enhancements If Possible
1) Queue Updates While Job Running
The status sat at 'Keywords Queued' for more than an hour without processing, I guess because of the queue. Displaying the number of jobs in the queue ahead of mine would be more informative than the current situation.
2) Keep The Inputs Alive In Create Campaign
On the Create Campaign screen, after entering footprints and keywords, there is an option to enable the Dropbox feature. When I tried it, whatever I had typed into the Create Campaign screen was gone. Retaining whatever I typed when I come back would be a great help for us.
So, a possible solution (for your developers to look into):
1) When the Dropbox feature button is clicked, do a server-side postback and keep the typed values in the session.
2) Open the window using a JavaScript popup, and once the work is done, refresh the Create Campaign page.
3) While reloading, put the data saved in the session in step 1 back into the form.
You can get even better solutions with jQuery, without a server-side postback, if you would like me to suggest them.
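For example, a minimal client-side sketch that skips the postback entirely and keeps the draft in sessionStorage (the element IDs #footprints, #keywords, and #dropbox-btn are invented here; your markup will differ):

```typescript
// Client-side sketch with no server postback: stash the textareas in sessionStorage
// before the Dropbox popup opens, and restore them when Create Campaign loads again.
const DRAFT_FIELDS = ["footprints", "keywords"];

function saveDraft(): void {
  for (const id of DRAFT_FIELDS) {
    const el = document.getElementById(id) as HTMLTextAreaElement | null;
    if (el) sessionStorage.setItem(`draft:${id}`, el.value);
  }
}

function restoreDraft(): void {
  for (const id of DRAFT_FIELDS) {
    const el = document.getElementById(id) as HTMLTextAreaElement | null;
    const saved = sessionStorage.getItem(`draft:${id}`);
    if (el && saved !== null) el.value = saved;
  }
}

// Save right before the Dropbox popup is opened; restore on every page load.
document.getElementById("dropbox-btn")?.addEventListener("click", saveDraft);
window.addEventListener("DOMContentLoaded", restoreDraft);
```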
Overall, this service is going to be famous among GSA users for sure. I will keep testing and let you know my views on the system, as scraping is my first priority for getting better-quality backlinks from GSA.
The problem is that GSA SER takes so long just to ingest the 700K targets. It has already been more than two hours and only 24K have been taken in, so I assume it is going to take a long time before I can check the GSA SER results.
All PMs for beta testers above this post have been sent - enjoy everyone and please PM me any bugs you find!
Thanks in advance
@backlinkaddict - Thanks so much for the great review pal!
@BanditIM
Love the service you are providing and keep up the good work.
I am much more interested in the beta test, to see if it can outperform SB; then I will surely give my review here without partiality.