I just tested out this service and I have to say it is amazing. You will save a lot of money by using this service as you will not have to pay for the following:
-dedicated server $60+
-proxies $40+
-software $70
All you have to do is upload your footprints and keywords and it will automatically scrape for you. It finished a 13 million URL scrape in a few hours. No need to worry about anything.
I was one of the first beta testers - what was that like 2 days ago? I swear it seems to be working better and better with every scrape.
CRITICAL: We REALLY need a better way to set up campaigns. With that credit system, I'm back in there setting up a campaign every hour or so. The ideal setup: I load in a monstrous list of footprints and keywords, and the system will run the campaign until the credits are used, pause it until the next day, then pick up where it left off.
That's the only real issue I see with this. Other than that - this is INCREDIBLE
The way you recommend would cause a huge server overload. I'd rather stick with the current system and have a stable platform than wait tens of hours to get a single campaign finalized.
Anyway, I just want to say once again that this service is a pure goldmine. I got more contextual links in the last 24 hours than in the last 2 months of purchasing various lists from various providers, wasting my time and money.
I just hope that the price for this service will be affordable too. I would pay $100 or a little more per month for what we have right now, 24M links per day, or even for fewer links, as I'm definitely not exporting that much because SER can't take it all. Even 10M-15M would definitely do the job.
@dariobl - see I think it would do quite the opposite. By having scheduled campaigns, they will know how much demand there is, and can schedule things accordingly. Right now - if a bunch of people go in at once and launch massive campaigns, then it's going to get hammered all at once.
I think it would be easy to set it up with an option to split up the results by XXX # of links and email / ftp / dropbox the results when it reaches that #.
So, you set up a campaign that could last weeks, and set it to deliver results every 1,000,000 links via email, dropbox, etc. Then, roughly once an hour you'll get a fresh list of links delivered.
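For what it's worth, that delivery option really just boils down to a small batching loop on their end. A rough sketch of the idea (all names here are made up for illustration, not ScraperBandit's actual code):

```typescript
// Hypothetical sketch of "deliver every N links" - names are illustrative only.
type Deliver = (chunk: string[], part: number) => Promise<void>;

async function deliverInChunks(
  urls: AsyncIterable<string>, // scraped links as they come in
  chunkSize: number,           // e.g. 1000000 links per delivery
  deliver: Deliver             // e.g. push the batch to email / FTP / Dropbox
): Promise<void> {
  let chunk: string[] = [];
  let part = 0;
  for await (const url of urls) {
    chunk.push(url);
    if (chunk.length >= chunkSize) {
      await deliver(chunk, part++); // ship a full batch, keep scraping
      chunk = [];
    }
  }
  if (chunk.length > 0) {
    await deliver(chunk, part); // ship whatever is left at the end
  }
}
```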
This system is incredible - however since I've been using it, I've had it open on my computer all day long, and am regularly in there starting a new campaign, so it's using up more time than I think it needs to.
If this was more of a set it and forget it solution - I think it would be a game changer. Don't get me wrong - I'm already on board, I love this system. But I think it could be a lot better if it were more automated.
@Crownvic - Looking into it, thanks for the report
@almostbasic and @dariobl - Please see our update post http://banditim.com/blog/scraperbandit-update-1/ Should be ready by tonight! Dariobl, you will be able to use the system as it is now by simply creating small campaigns. Users like almostbasic will now have the option to submit large campaigns and not have to worry about constantly loading new ones in.
After 2 days of testing it, I have to say it is really great. I think it would be better to add "scheduled campaigns", so we could upload many keywords and footprints and just set it to deliver some results every day. That way we would not have to log in and do the same work over and over again. Another suggestion: some footprints available in the panel, ready to use. The user would only have to check which platforms he wants to look for, set keywords, and click "Start". Of course, the Scraper community would be able to edit footprints for each platform and share them with other users.
I would like to see one feature added to the product: could you please let us see the footprints and keywords that were used in a completed scrape, so we don't use those footprints and keywords again?
Also, I am very interested in the pricing for this after beta testing is completed. I hope it will be reasonable and affordable.
As for seeing keywords/footprints used, we have this feature already, but we took it off for now for simplicity's sake. It'll be added soon enough.
Pricing - Until we see the demand and the search terms users give us (hard or easy footprints, for example), I have set the price at $5 for 10 million links.
Just finished my testing. Scraping is amazing, and the results are deduped in two segments: deduped by URL and deduped by domain. So overall, a great service.
DropBox
Integrating with cloud storage is a nice idea. I had my Dropbox client running both locally and on my GSA server. After some work I came back to my desktop and found the job status was completed. I checked my Dropbox on both instances, and the archive with the URLs was already there. Overall a nice experience.
Scraping
I have yet to evaluate how closely this scraper's results match the footprints compared to Scrapebox. Overall, the numbers are great. I tried 110 footprints and 200 keywords, which is a combination of 22k search terms. The result I got from the BIM Google scraper was as below:
Non-deduped = 2.2 million
De-duped by URL = 710K
De-duped by domain = 270K
On average, each search term got around 120 results, which is great too.
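A quick sanity check on those numbers (the totals above are rounded, so the computed average lands a bit under the ~120 I mentioned):

```typescript
// Back-of-the-envelope arithmetic for the scrape above (rounded figures).
const footprints = 110;
const keywords = 200;
const searchTerms = footprints * keywords;          // 22,000 search texts

const rawResults = 2_200_000;                       // non-deduped
const uniqueUrls = 710_000;                         // deduped by URL
const uniqueDomains = 270_000;                      // deduped by domain

console.log(searchTerms);                           // 22000
console.log(rawResults / searchTerms);              // ~100 results per search term
console.log((uniqueUrls / rawResults * 100).toFixed(0) + "%");    // ~32% unique URLs
console.log((uniqueDomains / rawResults * 100).toFixed(0) + "%"); // ~12% unique domains
```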
GSA SER Submissions Of Scraped Targets
I will run these in GSA SER and update this thread once done. I am evaluating this because the footprints are taken from GSA SER verbatim.
Enhancements If Possible
1) Queue Updates While Job Running
The status sat at 'Keywords Queued' for more than an hour without being processed, I guess because of the queue. Displaying the number of jobs in the queue ahead of mine would be more informative than the current situation.
2) Keep The Inputs Alive In Create Campaign
On the Create Campaign screen, after entering footprints and keywords, there is an option to opt in to the Dropbox feature. When I tried that, whatever I had typed on the Create Campaign screen was gone. Retaining whatever I typed for when I come back would be a great help for us.
So, a possible solution (for your developers to look into):
1) When the Dropbox feature button is clicked, do a server-side postback and keep the typed values in the session.
2) Open the window as a JavaScript popup, and once the work is done you may wish to refresh the Create Campaign page.
3) While reloading, put the data kept in the session in step 1 back into the fields.
You can get even better solutions with jQuery, without a server-side postback, if you would like me to suggest one.
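For example, a rough client-side sketch that just stashes the drafts in the browser (the field IDs here are placeholders, not your actual markup):

```typescript
// Keep the typed footprints/keywords alive across the Dropbox popup step
// using localStorage. Element IDs are assumptions for illustration.
const FIELDS = ["footprints", "keywords"];

function saveDraft(): void {
  for (const id of FIELDS) {
    const el = document.getElementById(id) as HTMLTextAreaElement | null;
    if (el) localStorage.setItem(`campaign-draft:${id}`, el.value);
  }
}

function restoreDraft(): void {
  for (const id of FIELDS) {
    const el = document.getElementById(id) as HTMLTextAreaElement | null;
    const saved = localStorage.getItem(`campaign-draft:${id}`);
    if (el && saved !== null) el.value = saved;
  }
}

// Save before leaving for the Dropbox step, restore when Create Campaign loads again.
window.addEventListener("beforeunload", saveDraft);
window.addEventListener("DOMContentLoaded", restoreDraft);
```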
Overall, this service is going to be famous among GSA users for sure. I will keep testing and let you know my views on this system, as scraping is my first priority for getting better-quality backlinks from GSA.
The problem is that GSA SER takes a long time just to ingest the 700K targets. It has already been more than two hours and only 24K have been taken in, so I assume it is going to take quite a while before I can check the GSA SER results.
@convert2seo - Try uploading files instead of using the type-in fields. I believe that's the bug everyone is encountering, and some users are getting campaigns in just fine by uploading now.
@sarav - We have plans to include a standalone identify/sort for GSA lists that will automatically be run after results come back... this will save you LOTS of time.
We have fixed all bugs from today (despite it being Thanksgiving holiday). Scraper is running at full speed and you are able to use the text fields instead of uploading again.
Due to the bugs, we will be pushing out a more advanced scheduler, hopefully by tomorrow night. The current scheduler simply uses up all your credits until you replenish them (better than the previous system that only allowed you to upload as many search terms as your credits allowed). The advanced scheduler we are implementing will allow you to Play/Pause campaigns, as well as schedule "Create X links per day" for each campaign. It'll give you MUCH more control over your campaigns.
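In rough terms, the "Create X links per day" part is just a per-campaign daily budget check, something along these lines (an illustrative sketch only, not our actual backend code):

```typescript
// Illustrative-only sketch of a daily link budget per campaign.
interface Campaign {
  id: string;
  paused: boolean;
  dailyLinkBudget: number; // the "Create X links per day" setting
  scrapedToday: number;    // links delivered so far today
}

// How many more links this campaign may scrape right now.
function remainingBudget(c: Campaign, creditsLeft: number): number {
  if (c.paused) return 0;
  const dailyRemaining = c.dailyLinkBudget - c.scrapedToday;
  return Math.max(0, Math.min(dailyRemaining, creditsLeft));
}

// Run once per day (e.g. at midnight) so campaigns pick up where they left off.
function startNewDay(campaigns: Campaign[]): void {
  for (const c of campaigns) c.scrapedToday = 0;
}
```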
Okay, this could be a game changer for me. Scraping 24M links (ignoring dups etc.) takes me around 2-3 hours of public proxy scraping and then a day of scraping, then 5-10 minutes of deleting dups. With this service I can either cancel my scraper VPS or use it for other purposes. Huge money + time + resource saver.
@KevinRB - Once we push our big update out for the scheduled campaigns, we will open up purchasing credits. During the beta phase we will still be giving discounted links, so it may be in most people's interest to stockpile these discounted links while they can.
@Seljo - Darn, we fix one bug and it causes another... I'll have the dev look into it.
@sarav - GSA can post to IP-address URLs, though. I'll write it down on our todo list anyway.
I haven't uploaded an official review yet, but I have to say... you can scrape usable URLs in about 1 hour that would take me 24 hours, and probably a lot more, to do myself. This is just flabbergasting.
Wow, I just got a hands-on feel for this tool, and it is something out of this world. You don't need to spend money on a server and proxies to scrape a list.
I just played with it on a test drive and got about 22,000 unique domains and about 58,000 URLs. This is great, far better than Scrapebox and GScraper.
I just used it this morning and it is a BOMB, just as the developer said: you don't need a server or proxies to run this monster (tool). It is very easy to understand and it does what it says. I have this to say: THE END OF SCRAPEBOX, GSCRAPER, AND EVERY OTHER SCRAPER ON THE MARKET HAS COME.
But one recommendation I have is that it should be priced as a monthly fee, not credits, because anyone can easily use up the credits in one campaign. Also, the footprints should be preinstalled in the tool: since it is meant for scraping GSA footprints, they should be stored there so that users can edit, add, remove, or modify them to their taste.
[My English is very weak, so sorry about this.]
I just used this scraper last night, so here is my result. I used around 24,000 keywords and 1 footprint for the demo. There are still 16,376,324 links remaining on this demo package. Here is what I got:
Time Start -- Time End: 2014-11-28 09:03:37 -- 2014-11-28 11:51:15
7,623,676 links found
4,817,028 unique links found
1,183,102 unique domains found
Complaint: First of all, I did not get the verification email, but then I just used the forgot-password option and received the verification email and a random password. But now there is no option for changing the password.
This is just unbelievable. In just 30-40 minutes I scraped a list that would take me a month with Scrapebox. This is not normal. All I can say is thanks for letting me test your tool. It's a tool that a lot of people are willing to pay for, I'm 100% SURE.
My thoughts and review: I must say I like this service a lot.
I haven't been scraping much for SER as I wanted to use the inbuilt scraper, but I've scraped quite a lot for Scrapebox and Ultimate Demon in the past. It's time consuming and gets messy with all the processing and keeping track of all the files needed. This is clean and simple, and with the nature of SER's target list capabilities, I simply run the scrape, take the output, feed it to the software, and I'm done. I'm going to use this service a lot in the future.
All PMs for beta testers above this post have been sent - enjoy everyone and please PM me any bugs you find!
Thanks in advance
@backlinkaddict - Thanks so much for the great review pal!
@BanditIM
Love the service you are providing and keep up the good work.
I am much more interested in the beta test, to see if it can outperform SB; then I will surely give my review here without partiality.
Any indication yet as to the real launch date?
Regards
Kevin
@BanditIM
An option to exclude URLs with no domain (IP addresses) would be great.
Should have been made a long time ago! Lazy bugger @BanditIM
Did a test with Korean KWs. Scraper Bandit is a super app.
One suggestion: add a sieve filter like Hrefer's.
Sieve filter = scraped links are checked against the sieve filters; anything not matching the sieve filters will be discarded.
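In other words, something like this keep-if-matching pass over the scraped links (the patterns below are made-up examples):

```typescript
// Hrefer-style sieve filter: keep only URLs matching at least one sieve pattern.
function sieveFilter(urls: string[], sieves: RegExp[]): string[] {
  return urls.filter(url => sieves.some(sieve => sieve.test(url)));
}

// Example: keep only links that look like WordPress comment endpoints or guestbooks.
const sieves = [/wp-comments-post\.php/i, /guestbook/i];
const scraped = [
  "http://example.com/wp-comments-post.php",
  "http://example.org/about-us",
  "http://example.net/guestbook.php?page=2",
];
console.log(sieveFilter(scraped, sieves));
// -> ["http://example.com/wp-comments-post.php", "http://example.net/guestbook.php?page=2"]
```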