This service is practically a no-brainer for anyone who harvests their own URLs: it removes the need for additional proxies or servers for scraping and also increases productivity.
@BanditIM Sorry about not replying yet; it's been a busy Monday. I'll look through the PM and get back to you.
@magically -- Please refer to the beta thread in the original post. We have plans to create a 24/7 Footprint scraping system, which is not implemented quite yet. This is completely unrelated to uploading your own footprint lists.
@s4nt0s - Thanks so much Mr. Devin for the awesome review and screenshot! We have both your suggestions already on our to-do list, just a matter of time before getting to them.
Even though I don't think people will only enter a few keywords and footprints, there is a bug: when I entered 2 keywords and 2 footprints, the software says the campaign will use 4,000 links.
This service really does what it says; it's like having a web-based version of GScraper, but without worrying about proxies and servers. The interface is very simple but at the same time really user-friendly; it won't take you more than a minute to get it running.
If you know how to scrape links and have had success with GScraper, you'll get even better results with this service, and it's much faster as well.
Pros:
* User-friendly interface
* No need for proxies
* Fast results
* Automated deduping of duplicate domains and URLs
* Multiple delivery options
* Provides truly good links if you know how to scrape correctly
Cons:
* The service automatically removes the credits before it scrapes the links, which means you won't be able to input a huge amount of footprints/keywords, even if the final result wouldn't break your limit.
Feature requests:
* Enable scraping from different geo-location Googles (google.co.uk, google.de, etc.); probably just a few lines of code (see the sketch after this list)
* Enable filtering the links for custom terms, the same feature GScraper has.
* When someone uses "inurl:footprint" in the footprint section, collect only the links which really have that term in the URL.
* Filter the links by OBL (outbound links). I know this feature would slow down the whole thing, but if you could do it I would pay extra for sure, because SER checks that really slowly.
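Not part of the service today; this is just a rough sketch of what a geo-targeted query could look like, assuming the backend builds ordinary Google search URLs. The domain list, function name, and parameters here are illustrative only, not the service's actual code:

```python
import urllib.parse

# Hypothetical example: building search URLs against country-specific
# Google domains, as requested above. Purely illustrative.
GOOGLE_DOMAINS = {
    "us": "www.google.com",
    "uk": "www.google.co.uk",
    "de": "www.google.de",
}

def build_query_url(footprint: str, keyword: str, country: str = "us", page: int = 0) -> str:
    domain = GOOGLE_DOMAINS[country]
    query = urllib.parse.quote_plus(f"{footprint} {keyword}")
    # 'start' pages through results 10 at a time, like a normal Google SERP
    return f"https://{domain}/search?q={query}&start={page * 10}"

print(build_query_url('"powered by WordPress"', "gardening tips", country="uk"))
```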
Bugs:
So far I have only had two issues. The first is that I couldn't get the confirmation email, so I had to register again with another email, but that's probably your SMTP issue. The second is that one scraping campaign came back empty; there were no links inside the text files in the archive. Nothing major, though.
That's it for now. I hope I'll be able to use the service for a few more days, but I will definitely become your customer once this goes live commercially. All the link sellers here can pack their bags after this goes live.
@SuperSEO - It's already being looked into. We have other larger bugs we're sorting out, but that one is an easy fix. Will be fixed by the end of the night.
@Crownvic - We're fixing it. Thanks so much for the great review!
@dariobl - Thanks SO much for the long and thorough review pal! We found out the email registration was due to Aweber blocking your email address (spammer huh? :P). As for the 0 results, haven't heard of that yet, definitely will be looking into your campaigns.
I've written down your suggestions! What do you mean in regards to: "Enable filtering the links for custom terms, the same feature GScraper has"?
@fng - If they don't return the full 1,000 links, we give you the credit back for the ones that weren't returned.
I used my company email address for the first registration attempt, so it's definitely not an issue with my email address, but no problem; I got it registered with another address.
Regarding the filtering feature, I meant allowing us to export only the links with specific words in the URL, for example exporting only the URLs with .co.uk in the name.
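A minimal sketch of the kind of filter being asked for here, assuming the delivered results are a plain text file with one URL per line; the file names and the .co.uk term are just examples, not anything the service actually exposes:

```python
# Hypothetical example of the requested filter: keep only URLs whose
# address contains a given term (e.g. ".co.uk", or an inurl footprint).
def filter_urls(urls, term):
    term = term.lower()
    return [u for u in urls if term in u.lower()]

with open("scraped_urls.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

co_uk_only = filter_urls(urls, ".co.uk")

with open("filtered_urls.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(co_uk_only))
```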
- Very clean interface and incredibly easy to use. Seriously, this has to be one of the easiest platforms I have ever used. It's also very straightforward: you just upload your keywords and footprints and that's it. No learning curve at all.
- Scraping seems pretty fast.
- Great delivery format (3 lists: the entire list, unique domains, and unique URLs). No need to sort or deduplicate the lists yourself (see the sketch after this list).
- Awesome integration with Dropbox (although I haven't used that feature yet).
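For context, here is a minimal sketch of what producing those three lists involves, assuming the raw results come as one URL per line; the function and file names are illustrative, not the service's actual code:

```python
from urllib.parse import urlparse

def split_lists(raw_urls):
    """Produce the three delivery lists: full list, unique URLs, unique domains."""
    full_list = [u.strip() for u in raw_urls if u.strip()]
    # Preserve order while dropping exact duplicate URLs
    unique_urls = list(dict.fromkeys(full_list))
    # Keep only the first URL seen per domain
    seen, unique_domains = set(), []
    for url in unique_urls:
        domain = urlparse(url).netloc.lower()
        if domain and domain not in seen:
            seen.add(domain)
            unique_domains.append(url)
    return full_list, unique_urls, unique_domains

full, urls, domains = split_lists(open("results.txt", encoding="utf-8"))
```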
Still some bugs, like the copy/paste bug, only being able to scrape 2.4 instead of 24 million URLs, and the inurl footprints not seeming to give the desired results (possible blocking of advanced operators?), but the service has a lot of potential!
There needs to be some way to upload a ton of keyword+footprint combos in one go; uploading little chunks is too time-consuming. Allowing the user to control the number of scraped URLs would be nice too, i.e. being able to select 100 URLs scraped instead of 1,000 all the time.
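A quick sketch of one way to prepare such a bulk upload offline, assuming the service accepts a plain text list of footprint + keyword queries; the file names and output format are assumptions, not the actual upload spec:

```python
import itertools

# Hypothetical bulk preparation: combine every footprint with every keyword
# into one upload file, instead of pasting small chunks by hand.
footprints = [line.strip() for line in open("footprints.txt", encoding="utf-8") if line.strip()]
keywords = [line.strip() for line in open("keywords.txt", encoding="utf-8") if line.strip()]

with open("combos.txt", "w", encoding="utf-8") as out:
    for fp, kw in itertools.product(footprints, keywords):
        out.write(f"{fp} {kw}\n")

print(f"{len(footprints) * len(keywords)} combos written")
```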
We currently have a pretty large bug found from one user's uploaded list (something to do with a funky character). We're trying to find out exactly which keyword is causing it, so we may be down until tomorrow morning, unfortunately. Sorry for the inconvenience - this is what beta testing is for, though.
I tried the service yesterday, and so far I must say it does its job well; I think it has potential. Scraping is fast, cheaper, and beginner-friendly.
So I am pretty sure this service will be perfect for users with limited resources, because:
- no need for scraping software (GScraper, Scrapebox, Hrefer, ...)
- no need for a VPS
- no need for proxies
- cheap
- fast
- easy to use
Scraped URLs: results can be sent to email automatically after a task is completed, or uploaded to FTP or Dropbox. I also like the feature of sorting the results into "full list", "unique domains", and "unique URLs".
I have only one question: what does "Enclose Footprints:" mean?
@vuli - It means if you did not enclose your footprints with quotes beforehand (i.e. - powered by Wordpress), this checkbox would make it: "powered by Wordpress".
Thanks everybody!
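For anyone else wondering about "Enclose Footprints": the checkbox presumably just wraps each footprint in double quotes before the queries are built, roughly like this (a guess at the behavior, not the service's actual code):

```python
def enclose_footprints(footprints, enabled=True):
    """Wrap each footprint in double quotes so it is searched as an exact phrase."""
    if not enabled:
        return footprints
    return [fp if fp.startswith('"') and fp.endswith('"') else f'"{fp}"'
            for fp in footprints]

print(enclose_footprints(["powered by Wordpress"]))  # ['"powered by Wordpress"']
```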
@BanditIM - In your post it states: "However, for beta testing, you will be given 24 million links FREE each day until the end of beta testing!"
When I log in, it says I only get 2.4 million links: "You have 2400000 Request Remaining. Buy More?"
Can you please look into this?
Hi
Tried the beta version for the first time today. This is my impression:
- a nice clean interface
- very intuitive and easy to use
- I loaded my campaign, and within about twenty minutes I received a notification email for my results.
- I did not experience any technical issues and the scraper delivered the URLs as promised.
So far I am impressed and happy with the Scraper!
Thanks!