@Hunar - It's probably what you guys are saying - SER is having problems processing a verified list properly. Maybe it is timing.
When SER scrapes, it takes time to find each target before feeding it to the posting engine. My gut feeling is that a list has too much food: there's no wait time between targets, and they get shoveled down SER's throat too fast.
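To illustrate that theory (this is my own sketch, not SER's actual internals; `feed_targets` and `post` are hypothetical names): a scrape naturally spaces targets out, while an imported list delivers them instantly. Adding a minimum delay between targets would simulate the gap scraping provides for free.

```python
import time

def feed_targets(targets, min_delay=0.0, post=print):
    """Hand each target URL to `post`, waiting at least
    `min_delay` seconds between targets (hypothetical helper,
    sketching how a list feed could be throttled)."""
    handed = 0
    for url in targets:
        post(url)
        handed += 1
        if min_delay:
            time.sleep(min_delay)  # a scrape imposes this gap naturally
    return handed
```

If the timing theory holds, a list fed with `min_delay=0` would overwhelm the downloader, while the same list fed with a small delay would behave like a scrape.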
Yeah, that explains a lot. Their list is good, I know that for a fact. I think SER is just having a problem handling lists right now, which also explains other things, such as why not everyone is seeing this problem: not everyone uses a list. A lot of people just let SER scrape and post.
I am encountering the same problem. I am uploading a list that I know has working sites in it, yet 90% of them show as "download failed". I hope Sven puts this problem at the top of his to-do list when he gets back from vacation, because right now my LPM is only 1/10th of what it could be.
Most of the "download failed" sites load fine in my browser; it's only GSA SER that gets blocked. I think either hosts are blocking the way SER loads the sites, or the proxy IPs are on a blacklist and get redirected to a 404 page or a "domain does not exist" error.
So far I've swapped out over 300 semi-dedicated proxies across 3 providers with no change.
I'm getting between 0.20 and 6 LPM, down from 400 LPM before.
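One way to separate the two causes mentioned above (host blocking the tool vs. proxy IP blacklisted) is to fetch the same target once directly with browser-like headers and once through a proxy, then compare the results. This is a standalone diagnostic sketch using only the Python standard library; it is not part of SER, and the classification labels are my own rough heuristics.

```python
import urllib.request
import urllib.error

BROWSER_HEADERS = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Accept": "text/html,application/xhtml+xml",
}

def fetch_status(url, headers=None, proxy=None, timeout=15):
    """Return the HTTP status code, or a short error label on failure.
    `proxy` is e.g. "http://user:pass@1.2.3.4:8080" (hypothetical value)."""
    req = urllib.request.Request(url, headers=headers or {})
    handlers = []
    if proxy:
        handlers.append(urllib.request.ProxyHandler(
            {"http": proxy, "https": proxy}))
    opener = urllib.request.build_opener(*handlers)
    try:
        with opener.open(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code            # e.g. 403 suggests an active block
    except Exception as e:
        return type(e).__name__  # DNS failure, timeout, connection refused

def diagnose(direct, via_proxy):
    """Rough classification from the two fetch results."""
    if direct == 200 and via_proxy != 200:
        return "proxy IP likely blacklisted"
    if direct != 200 and via_proxy != 200:
        return "host likely blocking the tool's request fingerprint"
    return "target reachable both ways"
```

Running `diagnose(fetch_status(url, BROWSER_HEADERS), fetch_status(url, proxy=my_proxy))` over a sample of the failed targets would show which failure mode dominates.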
Sorry Moonshine, we know the problem is with how SER handles lists right now, and that's what is causing the "download failed" errors. The good news is we figured out what is causing the issue; the bad news is there's nothing we can do until @sven is able to release an update fixing it.
I'm not sure if that's it, but test it. I've tested it more than 100 times. When I let SER scrape and post, it works fine. As soon as I load in ANY list, whether a verified list that SER scraped itself or a list I've bought from somewhere, I get "download failed" 100% of the time.
Oh, and glad to have you back, Sven. Hope you had a great vacation.
Hey @Sven, glad you are back, and hope you are well rested
We have been testing this every which way you can think of, and it always comes down to the same thing: running a list causes "download failed".
If you need a list to experiment with, let me know. Running any verified list seems to cause it. Importing the list vs. simply pointing the project options at the folder containing it (or both) doesn't seem to matter either, although SER does process faster if you do a direct import.
It started just 2-3 weeks ago; this was never really an issue before. I would rate it as the most important thing that needs your attention.
Could this problem arise when using a bigger number of threads? I have been running lists recently and observed it only once: I started to get "download failed", but it was the proxies that suddenly failed in large numbers. I'm not saying there's no problem, but as I am using 100 threads I don't feel it. I know you guys play it big with SER; maybe using up to 1,000 threads is choking it.
Lots of you here are also using buyproxies.org. I feel their proxy quality has gone down a bit, but their support is very fast and helpful with replacements, and I will keep using them as long as that stays true. Still, I have needed to replace proxies quite often lately.
I know you guys are trying to help and throw out suggestions, but believe me, I've literally tried everything.
@donchino, yes, I've lowered threads down to 100. Did it decrease a bit? Yes. But if that's the fix, SER is literally pointless for me to use, because I only get 5-10 LPM if that, and I'm sure it's the same for others. Also, it's not buyproxies or the proxies at all. I spent $500 on different proxy providers and made sure they were coming from different IP blocks. Still saw the exact same result.
@meph, yes, that was one of the first things I tried, and I still saw the exact same thing.
@ron, yes, please give me a list to test on. Though I don't know when I'll be ready to test, as I have to take care of so many things right now (the pain of getting back from holiday... after sorting it all out you could need another one).
I have done pretty much everything @Hunar has done, with the exception of the $500 dropped on new proxies, and I have had the same results. I've stopped using SER altogether until an update occurs, hoping you find the time to get this worked out. Hope you had a great vacation! Won't lie, though, I've been anxiously awaiting your return so an update can happen! lol
@botman, are you running lists, or are you letting SER scrape and post? Everyone getting the "download failed" error seems to be running lists; it's the #1 thing we have in common.
Seems like a plausible theory.