
Is GScraper a must?

I don't have a scraper, and my GSA SER just scrapes for sites itself. Should I buy GScraper?

Right now I do churn and burn with public proxies. Is GScraper going to help me in any way?

Comments

  • s4nt0s Houston, Texas
    It's really up to you. If you feel you're getting enough links with SER scraping, then an external scraper isn't needed. If you want to scrape your own lists, then grab SB (Scrapebox) or Gscraper.

    Or you can always buy lists. Some lists cost about the price of SB or Gscraper, but keep in mind that SB and Gscraper require decent proxies to scrape.
  • +1 what S4nt0s wrote.
    Gscraper is a very powerful harvester, but to benefit from it you will need $50-100 a month for a proxy service, plus a small dedicated server for harvesting only, or a medium dedicated server so you can run SER and Gscraper on the same machine.

    Even if you have money for this, you may also need an additional VPS or small dedi, plus an additional SER + CB license, for processing the harvested lists. Better to stick with SER for harvesting, or buy lists, if you don't have many websites/clients.
  • Any recommendations on where to get proxies for scraping?
  • I also asked this question and everyone says a BIG YES  :D
  • I won't be able to pay for proxies. Can't I use this software without a proxy, or does it have public proxies built in?
  • The "built in" proxies cost $66 a month. It's not possible to scrape Google without proxies, and good public proxies cost a lot.
  • The built-in proxies are slow as diarrhea through a shower drain. Just get yourself 25 proxies from buyproxy and you will have a great scraper.
  • PaulieP, how often will I need to buy these 25 proxies?
  • edited July 2014
    25 proxies from buyproxy... Gscraper would get that many proxies banned in 5 minutes or faster. It's not possible to scrape any reasonable amount of links with private proxies without spending mid $XXX on proxies alone.
  • I'm still using 30 semi-dedicated private proxies to scrape with Gscraper. My IPs are not getting banned fast. I can scrape 24/7 @ 5 threads and still get 3k URLs a second. I do try to avoid advanced search operators, but yes, it's entirely possible to scrape with private proxies.
  • @the_other_dude - You need port-scanned proxies. I am using 20k public proxies and advanced search operators, with much better results.
  • @redfoxseo I know. But I don't have time to hunt down a good provider, or mess with picky people who want certain post counts before you can use their service, etc. One day I will hopefully find someone who sells a good service, or learn to port scan on my own.

    I will note: I've been using Footprint Factory to generate my footprints, and have been getting more than satisfactory results scraping with text-snippet-style footprints. I don't do SEO on a commercial scale, so weigh both options and find what you need based on the level your work demands.
  • So now this is my question: is there any way I can buy GScraper for $66, pay $10 a month for proxies, and still get to use the software properly? I can maybe spare $10 MAX a month on proxies.
  • Good luck getting proxies for $10...

    I wish I hadn't bought Gscraper; I'm finding I can do much better with SER alone. It takes over a month to scrape a 10-20 million URL list, and I doubt I even get a few hundred thousand verified out of that. I have multiple servers processing the lists, but some servers just run SER and I'm getting far more verifieds. Maybe I need to look at other ways of scraping in case I'm doing it wrong, but I've followed many tutorials and used many methods from my Scrapebox days. I really try to hone in on footprints that are for SER engines, obviously, but it's a bit shit.

    As a tool Gscraper is great but proxies are the downfall.
  • edited July 2014
    A way to scrape with Gscraper without any monthly charges would be to have SER search for and test public proxies, export them to a file, and then make Gscraper import this file every x minutes (a rough sketch of this loop follows this comment).

    It really is a good idea to scrape with SER if you can't afford, or don't know how, to get good port-scanned proxies. SER lets you scrape lots of search engines (remember to get rid of the worthless ones), and this lets you scrape a lot more URLs before your proxies get burnt compared to scraping with Gscraper. The disadvantage is that SER then isn't posting as much as it could.

    @JudderMan, if it takes over a month to scrape 10-20 million URLs then you're definitely doing it wrong. It shouldn't take more than hours with good proxies. The key is continuously testing and adding proxies outside Gscraper.
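    To make the "SER exports, Gscraper imports" loop above concrete, here is a minimal sketch in Python. The file paths and the 5-minute interval are assumptions; point them at wherever SER actually exports its tested public proxies and at the file Gscraper is set to re-import.

    ```python
    import shutil
    import time

    # Hypothetical paths - adjust to your own SER export location and to the
    # file Gscraper is configured to (re)import proxies from.
    SER_EXPORT = r"C:\GSA\public_proxies.txt"
    GSCRAPER_IMPORT = r"C:\Gscraper\proxies.txt"
    INTERVAL = 300  # seconds between refreshes; arbitrary choice

    while True:
        try:
            # Overwrite Gscraper's proxy list with SER's latest tested proxies.
            shutil.copyfile(SER_EXPORT, GSCRAPER_IMPORT)
            print("proxy list refreshed at", time.strftime("%H:%M:%S"))
        except FileNotFoundError:
            print("SER export not found yet; waiting for the next cycle")
        time.sleep(INTERVAL)
    ```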
  • @fakenickahl you're right... I was scraping at 90k LPM but now I'm getting around 300 LPM. I have heard reports that the proxy provider I use is overselling, which sucks, as they've just increased the price by 50%. And to think I was sticking up for them in another thread... grrr.
  • edited July 2014
    @JudderMan - That's the issue: when the same proxies are sold to 1000s of customers, your proxies start dying almost immediately.
  • Proxy lists are always a problem. There will be customers who share the list with X friends or on other forums, and the proxies will die very fast. Every customer gets access to the full list, so in theory one customer can burn all the proxies.

    The idea is quite good, but the problem is SER's speed, as it's not software made for proxy testing. All the proxy checkers you will find here or at BHW are very slow. Of course you could make a custom script to do this (a minimal sketch follows this comment), but you'll need a one-time payment for the script if you can't develop it yourself, plus a server to scrape and test proxies: $30-40 a month or more for the server alone.
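    For what it's worth, such a checker doesn't have to be elaborate. Below is a minimal sketch: a threaded tester that keeps only the proxies able to fetch Google, the target that actually matters for scraping. The file names, thread count, and timeout are all assumptions.

    ```python
    import concurrent.futures
    import urllib.request

    TEST_URL = "https://www.google.com/"  # the engine we actually scrape
    TIMEOUT = 5    # seconds per proxy; arbitrary
    THREADS = 200  # far more parallelism than SER's built-in tester

    def check(proxy):
        """Return the proxy (given as ip:port) if it can fetch TEST_URL."""
        handler = urllib.request.ProxyHandler({"http": proxy, "https": proxy})
        opener = urllib.request.build_opener(handler)
        try:
            if opener.open(TEST_URL, timeout=TIMEOUT).status == 200:
                return proxy
        except Exception:
            return None

    with open("scraped_proxies.txt") as f:       # hypothetical input list
        proxies = [line.strip() for line in f if line.strip()]

    with concurrent.futures.ThreadPoolExecutor(THREADS) as pool:
        alive = [p for p in pool.map(check, proxies) if p]

    with open("alive_proxies.txt", "w") as f:    # feed this file to Gscraper
        f.write("\n".join(alive))

    print(f"{len(alive)} of {len(proxies)} proxies alive")
    ```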
  • Nah, sorry, but you don't need any custom script for port scanning. Just fire up nmap from a Debian or CentOS shell; it works blazingly fast, unlike its Windows version (see the sketch after this comment).

    The problem is learning good IP ranges. Scanning IP blocks blindly will get you nowhere.
    I couldn't find any material on this one. I would even pay for a good tutorial about IP ranges.
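    For anyone curious what the nmap route looks like in practice, here is a rough sketch that shells out to nmap and collects open proxy ports. The IP range is a placeholder, since, as said above, knowing which ranges are worth scanning is the real problem, and the results are only candidates until they pass a checker like the one sketched earlier.

    ```python
    import re
    import subprocess

    IP_RANGE = "192.0.2.0/24"          # placeholder; good ranges are the hard part
    PROXY_PORTS = "80,3128,8080,8888"  # common HTTP proxy ports

    # --open: report only hosts with open ports; -oG -: greppable output to stdout.
    scan = subprocess.run(
        ["nmap", "-p", PROXY_PORTS, "--open", "-T4", "-oG", "-", IP_RANGE],
        capture_output=True, text=True, check=True,
    )

    candidates = []
    for line in scan.stdout.splitlines():
        host = re.search(r"Host: (\S+)", line)
        for port in re.findall(r"(\d+)/open/tcp", line):
            candidates.append(f"{host.group(1)}:{port}")

    # An open port is not a confirmed proxy - test these before using them.
    print("\n".join(candidates))
    ```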
  • goonergooner SERLists.com
    One of the most important things is finding good footprints. Proxies can't be relied on, so you need to make sure whatever you're scraping is very targeted. Footprint Factory makes that pretty easy; you can also scrape footprints individually and test the results (see the sketch after this comment).

    It's a lot of work but worth it in the end.
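    As a rough illustration of making scrapes targeted, the sketch below expands a footprint file against a keyword file into one query per combination, ready to import into a scraper. The file names are assumptions, and this is generic query generation, not Footprint Factory's own output format.

    ```python
    import itertools

    # Hypothetical inputs: one engine footprint per line ("Powered by ..."),
    # one niche keyword per line.
    with open("footprints.txt") as f:
        footprints = [line.strip() for line in f if line.strip()]
    with open("keywords.txt") as f:
        keywords = [line.strip() for line in f if line.strip()]

    # Each footprint/keyword pair becomes one search query.
    with open("queries.txt", "w") as out:
        for fp, kw in itertools.product(footprints, keywords):
            out.write(f'"{fp}" {kw}\n')

    print(f"{len(footprints) * len(keywords)} queries written")
    ```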
  • derdor 
    I'm not talking about port scanning. I'm talking about a script for scraping proxies from websites and testing them the same way SER does, but faster. I don't think anyone will teach you how when it comes to port scanning.
  • Ah, I see.
    ZennoPoster's proxy checker is a great tool; very fast, does the job for me.

