@convert2seo - Try uploading files instead of using the type-in fields. I believe that's the bug everyone is encountering, and some users are getting campaigns in just fine by uploading now.
@sarav - We have plans to include a standalone identify/sort for GSA lists that will automatically be run after results come back... this will save you LOTS of time.
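For anyone wondering what that identify/sort step might look like in practice, here is a rough sketch of sorting scraped URLs into platform buckets by footprint matching. The footprint strings and function names are made up purely for illustration; this is not BanditIM's actual implementation.

```python
# Illustrative only: group scraped URLs by whichever footprint they match.
FOOTPRINTS = {
    "wordpress": ["wp-content", "wp-login.php"],
    "drupal": ["/node/", "drupal"],
}

def identify(url, footprints=FOOTPRINTS):
    """Return the first platform whose footprint appears in the URL, else None."""
    lowered = url.lower()
    for platform, markers in footprints.items():
        if any(marker in lowered for marker in markers):
            return platform
    return None

def sort_results(urls):
    """Group URLs by identified platform; unidentified URLs go to 'unknown'."""
    buckets = {}
    for url in urls:
        buckets.setdefault(identify(url) or "unknown", []).append(url)
    return buckets

if __name__ == "__main__":
    sample = ["http://example.com/wp-content/a.jpg", "http://example.org/node/42"]
    for platform, hits in sort_results(sample).items():
        print(platform, len(hits))
```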
We have fixed all of today's bugs (despite it being the Thanksgiving holiday). The scraper is running at full speed, and you are able to use the text fields again instead of uploading.
Due to the bugs, we will be pushing out a more advanced scheduler, hopefully by tomorrow night. The current scheduler simply deducts all of your credits until you replenish them (better than the old system, which only allowed you to upload as many search terms as your credits would cover). The advanced scheduler we are implementing will let you Play/Pause campaigns, as well as schedule "Create X links per day" for each campaign. It'll give you MUCH more control over your campaigns.
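To make the planned scheduler concrete, here is a minimal sketch of per-campaign Play/Pause plus a "Create X links per day" cap. The class and field names are invented for illustration and are not the actual Scraper Bandit code.

```python
# Illustrative sketch of a per-campaign daily quota with play/pause.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Campaign:
    name: str
    daily_limit: int          # "Create X links per day"
    paused: bool = False
    built_today: int = 0
    day: date = field(default_factory=date.today)

    def can_build(self) -> bool:
        if self.day != date.today():          # new day, reset the counter
            self.day, self.built_today = date.today(), 0
        return not self.paused and self.built_today < self.daily_limit

    def record_links(self, n: int) -> None:
        self.built_today += n

if __name__ == "__main__":
    c = Campaign("demo", daily_limit=500)
    if c.can_build():
        c.record_links(120)
    print(c.built_today, "links built today")
```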
Okay, this could be a game changer for me. Scraping 24M links (ignoring dups etc.) takes me around 2-3 hours of public proxy scraping and then a day of link scraping, then 5-10 minutes of deleting dups. With this service I can either cancel my scraper VPS or use it for other purposes. Huge money + time + resource saver.
@KevinRB - Once we push out our big update for scheduling campaigns, we will open up purchasing credits. During the beta phase we will still be giving discounted links, so it may be in most people's interest to stockpile these discounted links while they can.
@Seljo - Darn, we fix one bug and it causes another... I'll have the dev look into it.
@sarav - GSA can post to IP-address URLs, though. I'll add it to our todo list.
I haven't posted an official review yet, but I have to say... in about 1 hour you can scrape usable URLs that would take me 24 hours, and probably a lot more, to do. This is just flabbergasting.
Wow, I just got a hands-on feel of this tool and it is something out of this world. You don't need to spend money on a server and proxies to scrape lists.
I just played with it on a test drive and got about 22,000 unique domains and about 58,000 URLs. This is great, far better than ScrapeBox and GScraper.
I used it this morning and it is a BOMB, just as the developer said. You don't need a server or proxies to run this monster (tool), it's very easy to understand, and it does what it says.
I have this to say: THE END OF SCRAPEBOX AND GSCRAPER AND EVERY OTHER SCRAPER ON THE MARKET HAS COME.
But one recommendation I have is that it should be a monthly fee rather than credits, because anyone can easily use up the credits in one campaign.
Also, the footprints should be preinstalled in the tool. Since it is meant for scraping GSA footprints, they should be stored there so that users can edit, add, remove, or modify them to their taste.
[My English is very weak, so sorry about this.]
I just used this scraper last night, so here is my result. I used around 24,000 keywords and 1 footprint for the demo. There are still 16,376,324 links remaining on this demo package.
Here is what I got:
Time Start -- Time End: 2014-11-28 09:03:37 -- 2014-11-28 11:51:15
Links I got: 7,623,676 links found, 4,817,028 unique links found, 1,183,102 unique domains found.
Complaint: First of all I did not get the verification email, but then I just used "forgot password" and received the verification email and a random password. But now there is no option for changing the password.
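As a side note, the unique-link and unique-domain counts above are just the result of a dedup pass over the raw scrape. A minimal sketch of that step, assuming you have the raw URL list in memory, looks roughly like this (in practice you would read it from the exported file):

```python
# Illustrative dedup pass: count unique URLs and unique domains in a raw scrape.
from urllib.parse import urlparse

def dedup(urls):
    """Return (unique_urls, unique_domains) from an iterable of scraped URLs."""
    unique_urls, domains = set(), set()
    for url in urls:
        url = url.strip()
        if not url:
            continue
        unique_urls.add(url)
        domains.add(urlparse(url).netloc.lower())
    return unique_urls, domains

if __name__ == "__main__":
    raw = ["http://a.com/x", "http://a.com/x", "http://a.com/y", "http://b.org/z"]
    unique_urls, domains = dedup(raw)
    print(len(unique_urls), "unique links,", len(domains), "unique domains")
```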
This is just unbelievable. I scraped in 30-40 minutes a list that would take me a month with ScrapeBox. This is not normal. All I can say is thanks for letting us test your tool. It's a tool that a lot of people are willing to pay for. I'm 100% SURE.
My thoughts and review: I must say I like this service a lot.
I haven't been scraping much for SER, as I wanted to use the inbuilt scraper, but I've scraped quite a lot for ScrapeBox and Ultimate Demon in the past. It's time consuming and gets messy with all the processing and keeping track of all the files needed. This is clean and simple, and it fits the nature of SER's target list capabilities: I simply run the scrape, take the output, feed it to the software, and I'm done. I'm going to use this service a lot in the future.
Any indication yet as to the real launch date?
Regards
Kevin
@BanditIM
An option to exclude URLs that are bare IP addresses (no domain) would be great.
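For reference, the requested filter is straightforward to sketch: drop any URL whose host parses as a bare IP address rather than a domain name. The function names here are illustrative only and are not part of the service.

```python
# Illustrative filter: discard URLs whose host is a bare IP address.
import ipaddress
from urllib.parse import urlparse

def is_ip_url(url: str) -> bool:
    host = urlparse(url).hostname or ""
    try:
        ipaddress.ip_address(host)
        return True
    except ValueError:
        return False

def drop_ip_urls(urls):
    return [u for u in urls if not is_ip_url(u)]

if __name__ == "__main__":
    print(drop_ip_urls(["http://93.184.216.34/post.php",
                        "http://example.com/post.php"]))
```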
Should have been made a long time ago! Lazy bugger @BanditIM
Did a test with Korean KWs. Scraper Bandit is a super app.
One suggestion: add a sieve filter like Hrefer's.
Sieve filter = scraped links are checked against the sieve filters; anything not matching the sieve filters will be discarded.
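Roughly, a sieve filter of that kind boils down to the following sketch: keep only the scraped links that match at least one sieve pattern and discard the rest. The patterns shown are made-up examples, not Hrefer's defaults.

```python
# Illustrative Hrefer-style sieve filter over scraped URLs.
import re

SIEVE_PATTERNS = [r"/guestbook", r"wp-comments-post\.php", r"/node/\d+"]

def sieve(urls, patterns=SIEVE_PATTERNS):
    """Keep only URLs matching at least one sieve pattern; discard everything else."""
    compiled = [re.compile(p, re.IGNORECASE) for p in patterns]
    return [u for u in urls if any(rx.search(u) for rx in compiled)]

if __name__ == "__main__":
    print(sieve(["http://a.com/guestbook.php",
                 "http://b.com/random-page"]))
```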