"Download failed" is normally a proxy issue. If you are using public proxies, get rid of them and buy yourself some private or semi-private proxies. You will not regret it.
Chymmi
And how many proxies do I need for 1500 threads? 1500?
Brumnick
edited March 2013
No idea. That seems an overly aggressive number of threads to be honest. What are your goals in terms of links built per day?
Edit: I build (submit) between 70k and 100k links per day, using 90 threads and 10 private proxies.
Chymmi
I've now bought 30 proxies, so let's see how much better it is.
Chymmi
Anyway, if you got 100k links per day with 90 threads, then **** public proxies really suck.
Brumnick
Haha, fair play to ya. I'd keep the threads low and build up, if I were you...
Whilst you are at it, get GSA CB; I doubled my daily output after changing from CS to CB.
Brumnick
Public proxies are the devil.
Ozz
edited March 2013
You don't need that many threads when using private proxies.
Your machine and proxies won't handle 1500 threads with privates.
The reason is that a thread bound to a dead proxy does nothing but wait for the HTML timeout to count down to zero before the thread is freed for another task.
Buy 10 proxies for up to 100 threads (20 if you want to make sure that everything runs smoothly) and see how many threads your machine can manage.
Edit: I should've read the whole thread. 30 proxies are more than fine.
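Ozz's rule of thumb above (roughly 10 private proxies per 100 threads, or 20 per 100 for headroom) can be sketched as a tiny helper. This is just the ratio from this thread expressed in code, not anything SER itself enforces, and the function name is made up for illustration:

```python
def proxies_needed(threads: int, per_100: int = 10) -> int:
    """Estimate private proxies for a thread count, using the rough
    rule of thumb from this thread: about 10 proxies per 100 threads
    (pass per_100=20 for extra headroom)."""
    blocks = -(-threads // 100)  # ceiling division into 100-thread blocks
    return blocks * per_100
```

By this ratio, Brumnick's 90 threads line up with his 10 proxies, while 1500 threads would call for something like 150, which is why lowering the thread count is the easier fix.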
AlexR Cape Town
Are they public or private?
How many SEs have you selected?
What's the SE wait time between searches?
All of these will affect the number of proxies you need!
ComputerEngineer
I don't use any proxies, at least not for downloading. Actually, I disabled all proxies and it's still failing.
I even tried lowering the thread count to 20. I have a 50 Mbit fiber connection.
Definitely a software bug; the URLs work via Firefox.
ComputerEngineer
Still failing, even with these settings.
Ozz
edited March 2013
Maybe something is blocking your IP? Check your antivirus and firewall settings.
Edit: isn't one thread enough? Posting your issue in multiple threads doesn't help anyone. Furthermore, it makes Sven angry.
ComputerEngineer
@Ozz Do you actually read what I write? Firefox and the software use the same IP.
It failed, but double-click to open it in the browser and, tada, it works.
Ozz
edited March 2013
What I meant is whether SER's connection to the internet is being blocked by something.
Did you check that, and read what I wrote?
Ozz
edited March 2013
What I forgot to ask: did this issue occur today with the latest version, or did it appear out of nowhere? If everything worked yesterday, then roll back to the previous version to see if it works again.
ComputerEngineer
@Ozz It is set up in VMware on Windows XP, and no firewall or antivirus is installed. I don't know, but the latest version may be the issue.
But I am 100% sure it is a software bug, because my own C# 4.5 software validated that every one of the URLs I put into the campaign exists.
Ozz
Then roll back to the previous version and see if it works again. The .exe of the previous version is in your installation folder.
ComputerEngineer
edited March 2013
@Ozz If I set up proxies for the page rank check, are those proxies also used for downloading?
And can I disable the page rank check?
Ozz
edited March 2013
I don't understand that question, to be honest.
Disable the PR filters in your project options. No PR filter = no PR check.
ComputerEngineer
@Ozz I am not talking about the PR filter. The software automatically checks the page rank of at least every submission, even if I don't set any PR filter.
I don't want it to check PR at all.
ComputerEngineer
Yep, definitely a software problem.
If I set 20 threads, the number of failures is low. I just set it to 80 and, wow, so many failed ones.
But I have 50 Mbit fiber and can fetch results from Google with hundreds of threads via my own C# 4.5 software.
Here is an image that clearly explains everything:
No proxy enabled and the timeout set to 120 seconds.
And every one of these links was verified via my own C# 4.5 software.
ComputerEngineer
And when I check those failed ones via Firefox, they work perfectly, on the same machine with the same IP.
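One way to double-check this outside both SER and Firefox is to pull the failed URLs out of the log and re-fetch them with a neutral HTTP client. A minimal sketch in Python (the log format is copied from SER's "download failed" lines; the helper names are made up, and since this uses Python's stdlib rather than SER's own HTTP stack, it only shows the URLs are reachable, not why SER's request fails):

```python
import re
import urllib.request

# Matches lines like:
# 19:11:18: [-] 2576/3205 download failed - http://example.com/page/46/
FAILED_RE = re.compile(r"download failed - (?P<url>https?://\S+)")

def parse_failed_urls(log_lines):
    """Extract the URLs from 'download failed' log lines."""
    return [m.group("url") for line in log_lines
            if (m := FAILED_RE.search(line))]

def recheck(url, timeout=30):
    """Re-fetch one failed URL; True if it answers with HTTP < 400."""
    req = urllib.request.Request(
        url, method="HEAD", headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except OSError:
        return False
```

If `recheck` succeeds on URLs SER marks as failed, that supports the idea that the problem is on SER's side (timeouts, user agent, or thread starvation) rather than the sites themselves.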
ComputerEngineer
OK, the problem is most likely caused by a bug in the timeout settings.
Even if I set a 600-second timeout, after 10 seconds it says "download failed", yet that failed site works perfectly in Firefox.
LeeG Eating your first bourne
You can't set a 600-second timeout. The max it will go to is 180.
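So a value typed in above the cap is presumably just clamped rather than honoured. A sketch under that assumption (the 180-second ceiling is the one LeeG mentions; everything else is illustrative):

```python
MAX_TIMEOUT = 180  # seconds; the cap LeeG mentions

def effective_timeout(requested: int) -> int:
    """Clamp a user-entered HTML timeout to the allowed range."""
    return max(1, min(requested, MAX_TIMEOUT))
```

Either way, clamping 600 down to 180 still wouldn't explain a failure after 10 seconds, so the quick failures point at something other than the timeout value itself.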
@ComputerEngineer Do you have GSA Indexer running in the background?
Chymmi
LOL, LOL, I tried private proxies and the results are amazing. You have to add a big title to GSA saying that all users have to get private proxies xD I was using GSA SER only with public proxies; now imagine how many backlinks I lost...
ComputerEngineer
@Chymmi I fixed that problem by harvesting URLs with my own software.
Yes, I also brought all my lists (verified, successful, identified) into GScraper, removed duplicate URLs, removed low-PR ones, and now I get really fast submissions, only high PR. I am happy.
ComputerEngineer
Here is an example. It fails in SER, but double-click to open it in the browser and, tada, it works:
19:11:18: [-] 2576/3205 download failed - http://babyshower.tan-desmond.com/page/46/
ComputerEngineer
Downloading Windows Server 2012 to try it there.
@doubleup No, I don't have that running.