
Scraping using GScraper: help with footprints

So the problem isn't with GScraper. I have a problem when I transfer the URLs to GSA SER: it submits them, but I get a very low verified rate. If you have any advice, please comment below. I really need to fix this.

Comments

  • That's common; I also get a low percentage of verified links from lists scraped with GScraper.
  • What can I do now? I feel like I wasted money for no reason.
  • Scrape more links, process more links.

    Most advanced users have 2-3 servers dedicated just to scraping and processing these scraped lists.
  • Maybe you can share what percentage of verified links you get from your raw scraped URLs?
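
    If you're not sure how to work that number out, export your raw GScraper list and SER's verified list to text files and compare unique counts. A minimal Python sketch; the file names are just placeholders for your own exports:

        # Rough verified-rate check; raw_urls.txt / verified_urls.txt are placeholder exports.
        def count_unique_urls(path):
            with open(path, encoding="utf-8", errors="ignore") as f:
                return len({line.strip().lower() for line in f if line.strip()})

        raw = count_unique_urls("raw_urls.txt")            # everything GScraper gave you
        verified = count_unique_urls("verified_urls.txt")  # SER's exported verified list
        print(f"{verified}/{raw} unique URLs verified = {verified / raw:.2%}")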
  • 1linklist FREE TRIAL Linklists - VPM of 150+ - http://1linklist.com
    If you'll share...

    1. What platforms you're posting to
    2. What your GSA configs are with respect to threads, captchas, and proxies

    We can help more :)
  • BuySERLists Vancouver, Canada
    @yazisback you didn't waste your money. You just need to understand that you are going to have to scrape and process a LOT of links in order to build your lists. Anywhere from hundreds of thousands to hundreds of millions. It's not going to be quick unless you have a lot of servers that you can use.

    Focus on a few things...

    1) Really work on your footprints; make sure you're using quality footprints. Look around the forum; there are numerous threads on this.

    2) Really work on your keywords. You can do much better than simply dumping a dictionary into GScraper; there's a small sketch after this list showing one way to combine footprints and keywords into scrape queries.

    3) Ensure you have a lot of proxies at your disposal for checking and posting in SER

    4) Make sure you use GSA-CB plus at least one OCR captcha service plus at least one text captcha service

    5) Run your scraped lists through multiple projects, as sometimes they will post on the 2nd or 3rd try. You can always dedupe your verified lists later (there's a quick dedupe sketch at the end of this comment).
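
    For points 1 and 2, the usual approach is to cross every footprint with every keyword so that each scrape query is unique. A rough Python sketch of that idea; footprints.txt, keywords.txt and queries.txt are just example file names:

        # Cross every footprint with every keyword to build one scrape query per line.
        # footprints.txt / keywords.txt / queries.txt are placeholder file names.
        from itertools import product

        def load_lines(path):
            with open(path, encoding="utf-8", errors="ignore") as f:
                return [line.strip() for line in f if line.strip()]

        footprints = load_lines("footprints.txt")  # quality platform footprints
        keywords = load_lines("keywords.txt")      # niche keywords, not a raw dictionary dump

        with open("queries.txt", "w", encoding="utf-8") as out:
            for fp, kw in product(footprints, keywords):
                out.write(f'{fp} "{kw}"\n')        # paste queries.txt into your scraper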

    GSA SER isn't some magic button that builds 20 zillion links. You need to use it wisely.
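
    And for point 5, once you have merged the verified output of several projects, you can dedupe by domain if you only care about unique sites (or use the full URL as the key if you want unique pages). Another quick sketch with placeholder file names; it assumes the list contains full http(s) URLs:

        # Dedupe a merged verified list by domain; file names are placeholders.
        from urllib.parse import urlparse

        seen, unique = set(), []
        with open("verified_merged.txt", encoding="utf-8", errors="ignore") as f:
            for line in f:
                url = line.strip()
                domain = urlparse(url).netloc.lower()  # empty if the line isn't a full URL
                if domain and domain not in seen:
                    seen.add(domain)
                    unique.append(url)

        with open("verified_deduped.txt", "w", encoding="utf-8") as out:
            out.write("\n".join(unique) + "\n")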