How do I Limit Download Attempts???

edited April 2014 in Need Help
I'm running a project using a verified list and working private proxies. In the log I see the following:

download failed (proxy: proxyIP:proxyport) http://domain.com/1

SER will repeatedly attempt to download the same failed URL using the SAME proxy, over and over.

In my project I have not checked the option to repeatedly post to failed URLs.

How / where can I tell SER not to repeatedly attempt to download failed URLs? Also, why would it repeatedly use the same proxy?

Comments

  • SvenSven www.GSA-Online.de
    It should not use the same proxy, but it can use the same URL, depending on where you get targets from. This also happens with engines that have fixed URLs.
  • It is using the same proxy repeatedly for the same URL. I'm running a Pligg bookmarking run using the global site lists, with only Pligg selected. It appears SER eventually got through all the URLs and then just kept trying to download the one failed URL with the same proxy. I never received the out-of-targets message.

    Is there an option to tell SER to attempt downloading failed URLs x times?

    Thanks!
  • SvenSven www.GSA-Online.de
    Hmm, no, but I will have another look at getting a new proxy if the last download failed.
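
A minimal sketch for illustration: SER is closed-source, so the Python below is not its actual code, just the general "cap the attempts, use a new proxy on each failure" behaviour discussed above. The proxy addresses, the MAX_ATTEMPTS value, and the download_with_proxy_rotation helper are assumptions made for this example, not SER settings.

import requests  # generic HTTP client, used only for this illustration

MAX_ATTEMPTS = 3  # assumed retry cap, not an actual SER option

def download_with_proxy_rotation(url, proxies):
    """Try the download through a different proxy on each attempt,
    giving up after MAX_ATTEMPTS instead of re-using one failed proxy."""
    for proxy in proxies[:MAX_ATTEMPTS]:
        try:
            resp = requests.get(url,
                                proxies={"http": proxy, "https": proxy},
                                timeout=30)
            resp.raise_for_status()
            return resp.text
        except requests.RequestException:
            # mimic the log line from the thread, then move on to the next proxy
            print(f"download failed (proxy: {proxy}) {url}")
    return None

# hypothetical usage with placeholder proxies and the URL from the log above
download_with_proxy_rotation("http://domain.com/1",
                             ["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"])

The point of the sketch is just the loop: a fixed attempt limit and a different proxy per attempt, which is the behaviour Sven indicates SER should show when a download fails.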