
Download Failed?

24 Comments

  • ronron SERLists.com
    Hey, @Hunar, for the hell of it, why not set up a few clean test projects as I mentioned above, and turn off the real projects, and give it a whirl on those bad servers. Just to see for yourself if it makes a difference. 
  • With the last version, 8.51, all is OK.
  • Yep, I have buyproxies.org and at least 10 of my 30 proxies have been reporting download failed (proxy) all day.
  • It's got nothing to do with whether they are good or bad. I run the checks again and they are all reported as GOOD.
  • My new setup project ran fine for a few hours, but now I get 99% download failed again..
  • Yeah, sorry Ron. I tried it out and still got all download failed. :(

  • It seems like everyone is having problems, I'm having the same problem.
  • Yeah hoping @Sven‌ will be able to fix this before June 1st
  • ronron SERLists.com
    The pisser is that I just tried a server out of the Netherlands the last two days, and almost zero download failed proxy errors. Then I reverted back to my server in France, and I have them like crazy. Same version of SER. How insane is that? 
  • The reason why this is coming up is quite simple.

    Has anyone stopped GSA then clicked on the website it was serving when the error occurred?

    You will see that the website won't load. It means it's either down, slow, or having other problems. The timeout occurs but GSA doesn't know how to handle it except with the error you're seeing.

    Time to add an option to GSA, if there isn't one already, that says: if a certain amount of time passes without a response, SKIP IT!

    That's really what's happening here. It's got nothing to do with proxies. It's all about the site being accessed.
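    The skip-on-timeout behaviour described above can be sketched in a few lines. This is only an illustration of the idea, not GSA's actual code; the function name, the 10-second default, and the treat-errors-as-skip policy are all assumptions:

    ```python
    import socket
    import urllib.error
    import urllib.request

    def fetch_or_skip(url, proxy=None, timeout=10):
        """Try to download a URL once; return the HTTP status, or None to skip.

        Hypothetical sketch of "if a certain amount of time passes without a
        response, SKIP IT" -- a timeout or connection error is counted as a
        "download failed" for that target and we simply move on.
        """
        handlers = []
        if proxy:
            handlers.append(urllib.request.ProxyHandler(
                {"http": proxy, "https": proxy}))
        opener = urllib.request.build_opener(*handlers)
        try:
            with opener.open(url, timeout=timeout) as resp:
                return resp.status
        except (urllib.error.URLError, socket.timeout):
            return None  # site down, slow, or unreachable: skip, don't retry
    ```

    The key design choice is that a timeout returns immediately instead of blocking the thread, which is what the comment above is asking the tool to do.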
  • Your proxies are slow or dead (too many threads per proxy)... Send a ticket to buyproxies and they will replace all proxies...
  • Is it a proxy problem? @macco
  • edited May 2014
    Yeah, possible. Restart GSA and your computer, then try to run GSA without proxies (just for a minute!) to see the results. Check your proxies with Scrapebox and you will see how fast they are.

    In the past I had problems with XRumer because my home ISP limited my number of connections per minute. So if you run GSA at home you really need to get a VPS or dedi server.
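    The Scrapebox-style speed check suggested above amounts to timing a fetch through each proxy. A minimal stand-in sketch, assuming an arbitrary test URL and timeout (neither comes from Scrapebox itself):

    ```python
    import time
    import urllib.error
    import urllib.request

    def proxy_speed(proxy, test_url="http://example.com/", timeout=5):
        """Return the seconds a proxy takes to fetch test_url, or None if it fails.

        Rough illustration of a proxy speed check: a dead or overloaded proxy
        either errors out or exceeds the timeout, and both cases return None.
        """
        opener = urllib.request.build_opener(
            urllib.request.ProxyHandler({"http": proxy, "https": proxy}))
        start = time.monotonic()
        try:
            with opener.open(test_url, timeout=timeout):
                return time.monotonic() - start
        except (urllib.error.URLError, OSError):
            return None  # dead or too slow -- a candidate for replacement
    ```

    Running this over your whole proxy list quickly shows whether "download failed" correlates with slow proxies or not.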

  • WOW! I just told you guys what the problem was and you go off on a proxy tangent? Seriously, Wghero, did you read my post?
  • ... and I just did the 8.53 update, which did not address this issue. I wonder why Sven didn't fix it.
  • edited May 2014
    @mikie46 "Proxies Tab" click Proxies should resolve domain to IP.

  • @fEaRz that doesn't help for me; I'm still getting download failed all the way.
    @mikie Thanks for the comment, but that was the very first thing I did. Clicked on the sites to see if they loaded. Did this with 200-300 URLs and they all loaded just fine.

    @macco Thanks for your comment as well but did you read the thread at all or just comment? :/

    @ron I know, that's what's so weird about this. I have a few servers that are doing great, working 100% normally, while on other servers, no matter what I do or try or which proxy provider I use, everything is download failed. :(
  • It's not a proxy problem. I entered a few URLs showing download failed into a browser with the proxy and the sites loaded fast.
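    The manual browser test described above (fetch the failing URL both directly and through the proxy) can be automated. A sketch under the same assumptions as the comment: if both paths succeed, the proxy is probably not the cause. The function name and return shape are illustrative only:

    ```python
    import urllib.error
    import urllib.request

    def check_direct_vs_proxy(url, proxy, timeout=10):
        """Fetch a URL directly and through a proxy; return (direct_ok, proxy_ok).

        Hypothetical helper mirroring the manual test in the comment above:
        comparing the two booleans shows whether failures follow the proxy
        or the target site.
        """
        def ok(opener):
            try:
                with opener.open(url, timeout=timeout):
                    return True
            except (urllib.error.URLError, OSError):
                return False

        direct = urllib.request.build_opener()
        proxied = urllib.request.build_opener(
            urllib.request.ProxyHandler({"http": proxy, "https": proxy}))
        return ok(direct), ok(proxied)
    ```

    For example, (True, False) would point at the proxy, while (False, False) would point at the target site itself.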
  • edited May 2014
    @Kenny @mikie46 Please optimize your Windows TCP/IP settings (XP, 7, Server) to handle a large number of connections without packet loss -> http://www.speedguide.net/articles/windows-7-vista-2008-tweaks-2574
  • What packet loss? I'm not getting packet loss at all, and I wouldn't know where to start with that document.
  • I'm still getting really low success rate.
  • goonergooner SERLists.com
    I don't think it's packet loss; my servers are optimised and I am having the same problems as everyone else.
  • Don't know if this matters at all, but it's mainly only happening with lists. I've been scraping my own lists and distributing the verified URLs between all my servers, and I get tons of download failed.

    So I tested turning on the search engines on a few servers and letting SER scrape and post. I got very few download failed. So then I turned it on across all my servers, and same thing: very few download failed, and LPM back in the normal range. Has anyone else noticed this at all?
  • @Hunar I always let SER do everything on my pc and still have massive download failed.
  • goonergooner SERLists.com
    @Hunar - I just ran a quick test and yes, it seems to be the same for me. I only let SER run for 10 mins with no lists, but download failed seemed to be cut down a lot, so my first impression is that the problem is with running lists.

    Before anyone says the lists themselves are bad, I have almost every list sold on this forum, so they can't all be bad.
  • It's the same with freshly scraped lists via Scrapebox/Gscraper. I haven't tried letting SER scrape the target URLs by itself, because either way this is way too ineffective no matter the results.
    I'm getting about 3 times fewer verifications than usual.
  • Well, with lists and download failed, 2-5 LPM lol. With letting SER scrape, 75-100 LPM. At least it's somewhat of a solution until Sven comes back from vacation and fixes this, since he won't be back till June 10th.
  • Okay I asked my VPS provider to reinstall my VPS, I installed a fresh version of GSA and made new projects and I even switched the proxies, but I still get this error.
  • Do you use a list, Kenny, or do you let SER scrape and post?