@meb - just point SER to the folder itself that contains the URLs, or individual files. A lot of people just point it to a single file that they keep scraping with and ScraperBandit will continually append to that file.
-- you can import your own lists into SER and have SER use those lists individually I think... I'm really not the expert to ask though, I don't really use SER anymore. You'd be better off opening a thread and asking people who know what they're talking about :P
All I have to say is this is a really great tool. It saves a lot of the extra hassle involved with scraping new targets and takes away the need to buy all the extras that are usually required to operate. No more messing around with multiple tools; highly recommended you give it a try. Cheers lads.
Yes, it's great, working great now, thanks BanditIM!
So in order to get GSA to scrape, is there an advisable number of footprints to scrape in one go, so that GSA won't run out of targets so soon?
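As an aside on the footprint question above, the usual approach is to combine each footprint with each keyword into a search query and feed the queries in batches, so the scraper never runs dry all at once. A minimal sketch (illustrative only; the footprints, keywords, and batch size here are made-up examples, not values from ScraperBandit or GSA):

```python
from itertools import product

def build_queries(footprints, keywords, batch_size=50):
    """Combine every footprint with every keyword into a search query,
    then split the resulting queries into fixed-size batches."""
    queries = [f'{fp} "{kw}"' for fp, kw in product(footprints, keywords)]
    return [queries[i:i + batch_size] for i in range(0, len(queries), batch_size)]

# Hypothetical example inputs
footprints = ['"Powered by PHPLD"', 'inurl:guestbook']
keywords = ['fitness', 'travel', 'finance']

batches = build_queries(footprints, keywords, batch_size=4)
print(len(batches))  # 6 queries split into batches of 4 and 2
```

Feeding one batch at a time is what keeps a steady stream of fresh targets instead of exhausting the whole query list in one run.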
REVIEW: I started using ScraperBandit on Friday and have been using it throughout the weekend and it's honestly pretty great. It's so simple to use that you might find yourself looking for more stuff to input.
The hassle-free proxy part is probably the best thing about it since you're also saving money. I've scraped about 20,000 links so far and I'm going to do about another 20,000 today.
@Mylop - PayPal is having issues with their IPNs the past 24 hours. We're trying to force a check for the payment right now, should be fixed by this evening and you'll have your credits.
At first I tried the 10 proxies and thought they sucked.
Then I bought the 25 and still thought they sucked. Then I noticed I had "use server proxies" turned on, and that changed the WHOLE game around; instead of using Scrapebox proxies I was using theirs, and they go the speed of light!
I used BuyProxies before; I bought 10 dedicated and they are WORTHLESS with advanced operators. They stopped at around 12k URLs in 1-2 mins. These proxies, on the other hand, are a different story: I'm using advanced operators such as inurl etc., working on a 300k keyword list built from 72 of my normal keywords.
Will update tomorrow. Huge vouch!
Matter of fact, I am emailing BuyProxies for a full refund!
As long as this quality stays the same in the future, I wouldn't mind buying 1000 or more proxies.
@naruto12900 - Glad you got them working like they should; those are definitely the results we see too when scraping Google. You can contact me on Skype at googleboy507
Are there instructions/videos showing exactly how to set GSA SER to access the link list daily, so that as the list is updated, GSA SER automatically loads the new targets and posts to them without me needing to do it manually?
Ok, so I took the dive and signed up. It always amazes me when I see a product advertised as being able to do this and that, and I go and sign up and it actually delivers on the promises made. (So many don't.) I was already a member of the ProxyBandit service, which worked better than all the other backconnect services I've used, so I didn't hesitate to sign up for ScraperBandit. In all fairness, the other software in this space does deliver on its promises; however, there is one too many moving parts causing bottlenecks, and you spend a third of your time troubleshooting and problem solving. In my opinion, this is where ScraperBandit really shines. The process couldn't be simpler, and simple for me is always better. Neil, congratulations on this one, you knocked it out of the park with ScraperBandit.
@CecilDee - thank you so much for the kind words .
In regards to your question above, you just need to use the Dropbox feature on Scrapebandit, and then have SER read from that folder that has your scraped results always being updated.
I know this has been answered somewhere on this forum but I cannot find it. How exactly can one set SER to read from the Dropbox folder with the updated scraped results?
I have been having trouble with two scrapes in particular: trying to get some PHPLD directories, and also some expired Blogspot domains. For some reason both my Scrapebox and GScraper installations refused to return any results, despite the proxies working and all other similar scrapes being fine.
I tried out Scraper Bandit and I had to pay the $9 for the extra links as the demo only allowed me a few hundred keywords. It processed both scrapes in a few hours - 10k keywords in all and that brought me back over 1 million links on the larger scrape!
One issue I found is that the duplicate filter doesn't work properly. I downloaded just the unique domains and got a very small list that still had tons of URLs from one domain. So I've had to download the full list, split it in half (as Scrapebox can only handle 1 million at a time), and I'm filtering in SB.
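The unique-domain filtering described above (one URL kept per domain) can also be done locally in a few lines of Python, which avoids the 1-million-line Scrapebox limit. A sketch, illustrative only (the function name and sample URLs are my own):

```python
from urllib.parse import urlparse

def unique_domains(urls):
    """Keep only the first URL seen for each host, so a large scrape
    collapses to one entry per domain."""
    seen = set()
    result = []
    for url in urls:
        host = urlparse(url).netloc.lower()
        # treat "www.example.com" and "example.com" as the same domain
        host = host[4:] if host.startswith('www.') else host
        if host and host not in seen:
            seen.add(host)
            result.append(url)
    return result

urls = [
    'http://example.com/page1',
    'http://www.example.com/page2',
    'http://other.net/dir/',
]
print(unique_domains(urls))  # one URL kept for example.com, one for other.net
```

Since this streams line by line from an in-memory list, the same loop works fine reading a multi-million-line file one line at a time.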
It's good though - those two scrapes took up 2/3 of my $9 purchase so $3 in all. I had tried to get one of them done in a Fiverr gig yesterday but the guy cancelled the order at the last minute!
Anyway, I'm very happy with it and would definitely use it for any scrapes that my machine doesn't want to do!
@cmiddlebrook - Could you please PM me the campaign name that the unique-domain filter didn't work on? No one else has reported this before.
I finally got around to trying this thing out. I really, really like it. It's easy to setup and use. Plus, you don't have to mess with proxies or anything. I recommend you give it a try.
User below has a complaint that I will address below the image:
To address this -- my dev and I confirmed that the user above exhausted all their credits A YEAR AGO when they used this system. Because this was not only an entire year ago, but their credits were fully used, I was not willing to provide a refund.
I purchased some links but it hasn't updated on my account yet... it's been about an hour.
The link is dead - http://scraper.banditim.com/