Hey, @Hunar, for the hell of it, why not set up a few clean test projects as I mentioned above, turn off the real projects, and give it a whirl on those bad servers? Just to see for yourself if it makes a difference.
The pisser is that I tried a server out of the Netherlands the last two days and had almost zero download failed proxy errors. Then I went back to my server in France, and I have them like crazy. Same version of SER. How insane is that?
Has anyone stopped GSA and then clicked on the website it was working on when the error occurred?
You will see that the website won't load. It means it's either down, slow, or having other problems. The timeout occurs, but GSA doesn't know how to handle it except with the error you're seeing.
Time to add an option to GSA, if there isn't one already, that says: if a certain amount of time passes without a response, SKIP IT!
That's really what's happening here. It's got nothing to do with proxies. It's all about the site being accessed.
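To show what that skip-on-timeout idea would look like in principle (GSA SER is closed source, so this is only a rough sketch, not how SER actually handles it; the proxy address, timeout value, and URLs below are made-up examples):

```python
# Illustrative sketch only - not GSA SER's real behavior.
# The proxy address, timeout, and target URLs are placeholder values.
import requests

PROXY = {"http": "http://127.0.0.1:8080", "https": "http://127.0.0.1:8080"}  # assumed proxy
TIMEOUT_SECONDS = 10  # give up on a target after this long

def try_target(url):
    """Return True if the target responded in time, False if we should skip it."""
    try:
        resp = requests.get(url, proxies=PROXY, timeout=TIMEOUT_SECONDS)
        return resp.ok
    except (requests.Timeout, requests.ConnectionError):
        # Site is down, slow, or unreachable: skip it instead of
        # blaming the proxy and retrying forever.
        return False

targets = ["http://example.com/guestbook", "http://example.org/blog"]  # hypothetical targets
usable = [u for u in targets if try_target(u)]
print(f"{len(usable)}/{len(targets)} targets responded in time")
```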
Yeah, possible. Restart your GSA and computer, then try to run GSA without proxies (just for a minute!) to see the results. Check your proxies with Scrapebox and you will see how fast they are.
In the past I had problems with XRumer because my home ISP limited my number of connections per minute. So if you run GSA at home you really need to get a VPS or dedi server.
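If you don't have Scrapebox handy, something like this rough Python sketch does a similar speed check; the proxy list and test URL below are just placeholders:

```python
# Rough stand-in for a Scrapebox-style proxy speed check: time one request
# through each proxy and report how long it took.
import time
import requests

TEST_URL = "http://www.google.com"          # assumed test target
PROXIES = ["1.2.3.4:8080", "5.6.7.8:3128"]  # replace with your own proxy list

for p in PROXIES:
    proxy_cfg = {"http": f"http://{p}", "https": f"http://{p}"}
    start = time.monotonic()
    try:
        requests.get(TEST_URL, proxies=proxy_cfg, timeout=15)
        print(f"{p}: {time.monotonic() - start:.2f}s")
    except requests.RequestException:
        print(f"{p}: failed")
```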
@mikie Thanks for the comment, but that was the very first thing I did. Clicked on the site to see if it loaded. Did this with over 200-300 URLs and they all loaded just fine.
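For anyone who wants to repeat that check without clicking every URL by hand, a quick sketch like this would do it (the filename and thread count are made-up examples):

```python
# Bulk "does the site even load?" check. The input filename and worker count
# are placeholder values.
from concurrent.futures import ThreadPoolExecutor
import requests

def loads_ok(url):
    try:
        return url, requests.get(url, timeout=15).ok
    except requests.RequestException:
        return url, False

with open("failed_urls.txt") as f:           # assumed: one URL per line
    urls = [line.strip() for line in f if line.strip()]

with ThreadPoolExecutor(max_workers=20) as pool:
    for url, ok in pool.map(loads_ok, urls):
        print(("OK   " if ok else "FAIL ") + url)
```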
@macco Thanks for your comment as well, but did you read the thread at all or just comment?
@ron I know, that's what's so weird about this. I have a few servers that are doing great, working 100% normal, and then other servers where no matter what I do or try, or which proxy provider I use, everything is download failed.
Don't know if this matters at all, but it's mainly only happening with lists. I've been scraping my own lists and distributing the verified URLs between all my servers, and I get tons of download failed.
So I tested turning on the search engines on a few servers and letting SER scrape and post. I got very few download failed. So then I turned it on on all my servers and same thing: very few download failed and LPM back in the normal range. Has anyone else noticed this at all?
@Hunar - I just ran a quick test and yes, it seems to be the same for me. I only let SER run for 10 mins with no lists, but the download failed errors seemed to be cut down a lot, so my first impression is that the problem is with running lists.
Before anyone says the lists themselves are bad, I have almost every list sold on this forum, so they can't all be bad.
It's the same with freshly scraped lists via Scrapebox/Gscraper. I haven't tried letting SER scrape the target URLs by itself, because either way this is waaay too ineffective no matter the results. I'm getting about 3 times fewer verifications than usual.
Well, with lists and download failed it's 2-5 LPM lol. With letting SER scrape it's 75-100 LPM. At least it's somewhat of a solution until Sven comes back from vacation and fixes this, since he won't be back till June 10th.
Okay, I asked my VPS provider to reinstall my VPS, installed a fresh version of GSA, made new projects, and even switched the proxies, but I still get this error.