
Getting a "download failed" on a live site (Drupal)

edited March 2013 in Need Help
I've been using GSA SER for a couple of weeks with OK success, but I really want to improve the way I feed scraped lists of Drupal/Joomla/BuddyPress sites to it. I have pretty big lists, but a very low submission success rate.

I have paid attention to 1) proxies, 2) threads, 3) site sizes, and 4) HTML timeouts, either tweaking each one or taking it out of the equation.

So I am running a test right now: 1 thread, my own IP address (no proxies), parsing a Drupal site list. I'm getting a lot of "download failed" results, so I open one of the failed URLs and it loads just fine, and very fast, in my browser (so same IP). What could I have missed?
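For reference, this is roughly how one could reproduce the fetch outside a browser to see whether the server treats a bare client differently from a browser. It's a minimal Python sketch, not SER itself; the URL, timeout, and header values are placeholders, not anything from this thread.

import requests

url = "http://example-drupal-site.com/user/register"  # hypothetical target, not a real site from this thread

try:
    # Bare request, roughly what an automated tool sends.
    bare = requests.get(url, timeout=10)
    print("bare client:", bare.status_code, len(bare.content), "bytes")
except requests.RequestException as exc:
    print("bare client failed:", exc)

try:
    # Same URL with browser-like headers; if this succeeds where the bare
    # request fails, the server is filtering on User-Agent or other headers,
    # which would explain "loads in the browser, fails in the tool".
    browser_like = requests.get(
        url,
        timeout=10,
        headers={
            "User-Agent": "Mozilla/5.0 (Windows NT 6.1; rv:19.0) Gecko/20100101 Firefox/19.0",
            "Accept": "text/html,application/xhtml+xml",
        },
    )
    print("browser-like:", browser_like.status_code, len(browser_like.content), "bytes")
except requests.RequestException as exc:
    print("browser-like failed:", exc)

If both requests come back fine, the usual remaining suspects are timeouts that are too short, redirects, or the site only failing under load, rather than anything about the URL itself.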

I have read this thread and know how to use Google :) I'm more or less happy with autopilot scraping and posting; I want to hear from someone who loads big target lists successfully. Thanks!

Comments

  • Start by using proxies. Also (this has nothing to do with "download failed", but still) pay attention to the length of the passwords you choose, because different engines have different restrictions, and Drupal and Joomla are not the same. (A minimal sketch of picking a length that fits every engine follows after these comments.)
  • edited March 2013
    I am using proxies: I tried public, tried private, then tried my bare IP. The site loads in the browser but fails in SER. If the proxy/IP were to blame, wouldn't I be unable to load the site in my browser?

    Thanks for the password length tip!
  • AlexR Cape Town
    @bekkolt - what platforms restrict password lengths?
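On the password-length point above, here is a minimal Python sketch of the idea: pick a password length that satisfies the strictest limit across all the engines you post to. The min/max values are placeholders, not the real Drupal/Joomla/BuddyPress limits; check each engine's registration form for its actual rules.

import secrets
import string

# Hypothetical (min_len, max_len) limits per engine -- placeholders only.
ENGINE_LIMITS = {
    "drupal": (6, 128),
    "joomla": (8, 99),
    "buddypress": (6, 50),
}

def safe_password(limits):
    # Choose a length that satisfies the strictest minimum and maximum
    # across all engines, then fill it with random letters and digits.
    min_len = max(lo for lo, _ in limits.values())
    max_len = min(hi for _, hi in limits.values())
    length = min(max(12, min_len), max_len)  # prefer 12+ chars when allowed
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(safe_password(ENGINE_LIMITS))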