
Download Failed

edited March 2013 in Need Help
I'm getting a lot of "download failed" errors, but the sites themselves are working.
What am I doing wrong?

Comments

  • download failed is normally a proxy issue.  If you are using public proxies, get rid of them and buy yourself some private or semi-private proxies.  You will not regret it.
  • And how many proxies do I need for 1,500 threads? 1,500?
  • edited March 2013
    No idea.  That seems an overly aggressive number of threads to be honest.  What are your goals in terms of links built per day?

    Edit: I build (submit) between 70k and 100k links per day, using 90 threads and 10 private proxies.

  • I just bought 30 proxies, so let's see how much better it is :)
  • Anyway, if you get 100k links per day with 90 threads, then **** public proxies suck
  • Haha, fair play to ya. I'd keep the threads low and build up if I were you....


    You won't regret that either.

    Whilst you are at it, get GSA CB; I doubled my daily output after changing from CS to CB.
  • Public proxies are the devil.
  • Ozz
    edited March 2013
    you don't need that many threads when using private proxies. 
    your machine and proxies won't handle 1,500 threads with privates anyway. 

    the reason is that a thread bound to a dead proxy does nothing more than wait for the HTML timeout to count down to zero before the thread can be used for another task.

    buy 10 proxies for up to 100 threads, or 20 if you want to make sure everything runs smoothly, and see how many threads your machine can manage.

    edit: i should've read the whole thread ;). 30 proxies are more than fine.
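
    to picture that dead-proxy wait, here is a minimal C# sketch (the proxy address and the 30-second timeout are made up for illustration; this is not SER's actual code): the request simply blocks its thread until the client-side timeout expires.

        using System;
        using System.Net;
        using System.Net.Http;

        class DeadProxyDemo
        {
            static void Main()
            {
                // 10.0.0.99:8080 stands in for a dead proxy
                var handler = new HttpClientHandler
                {
                    Proxy = new WebProxy("10.0.0.99", 8080),
                    UseProxy = true
                };
                // the "html timeout": the thread is stuck until this elapses
                var client = new HttpClient(handler) { Timeout = TimeSpan.FromSeconds(30) };

                try
                {
                    // holds the worker thread idle for the full 30 seconds
                    var html = client.GetStringAsync("http://example.com/").Result;
                }
                catch (AggregateException)
                {
                    Console.WriteLine("download failed - the thread did nothing but wait");
                }
            }
        }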
  • AlexR Cape Town
    Are they public or private?
    How many SEs have you selected?
    What's the SE wait time between searches?

    All will impact the number of proxies you need!
  • i don't use any proxies :) at least for downloading.
    actually i disabled all proxies and it's still failing.

    i even tried lowering the thread count to 20.
    i have a 50 Mbit fiber connection.

    definitely a software bug. the sites work via Firefox.
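
    one thing worth ruling out before calling it a bug: some servers answer a real browser but reject a bare client request. a quick C# sketch shows whether the same URL behaves differently with and without browser headers (the User-Agent string is just an example, and the URL is a placeholder):

        using System;
        using System.Net.Http;

        class UaCheck
        {
            static void Main()
            {
                var url = "http://example.com/"; // put one of the failing URLs here
                var client = new HttpClient();

                // 1) bare request, roughly how a non-browser tool sends it
                var bare = new HttpRequestMessage(HttpMethod.Get, url);
                Console.WriteLine("no UA:      " + (int)client.SendAsync(bare).Result.StatusCode);

                // 2) same URL, but with a browser-like User-Agent header
                var browserLike = new HttpRequestMessage(HttpMethod.Get, url);
                browserLike.Headers.UserAgent.ParseAdd(
                    "Mozilla/5.0 (Windows NT 5.1; rv:19.0) Gecko/20100101 Firefox/19.0");
                Console.WriteLine("browser UA: " + (int)client.SendAsync(browserLike).Result.StatusCode);
            }
        }

    if the two status codes differ, it's the site rejecting the client, not the software failing.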
  • still failing even with these settings
  • Ozz
    edited March 2013
    maybe something is blocking your IP? check your antivirus and firewall settings.

    edit: isn't one thread enough? posting your issues in multiple threads doesn't help anyone. furthermore it makes Sven angry ;)
  • @Ozz do you actually read what i write?
    firefox and the software use the same ip :)

    here's an example:

    19:11:18: [-] 2576/3205 download failed - http://babyshower.tan-desmond.com/page/46/

    it failed, but double-click to open it in the browser and tada, it works
  • Ozz
    edited March 2013
    what i meant is whether SER's connection to the internet is blocked by something.
    did you check that and read what i wrote?
  • Ozz
    edited March 2013
    what i forgot to ask: did this issue occur today with the latest version, or did it come out of nowhere? if everything worked yesterday then roll back to the previous version to see if it's working again.
  • @Ozz it is set up in VMware on Windows XP, and no firewall or antivirus is installed.
    i don't know, but the latest version may be the issue.

    but i am 100% sure that it is a software bug,
    because my own C# 4.5 software validated that every one of the URLs i put into the campaign exists.
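
    for reference, an "exists" check like that can be as small as this C# 4.5 sketch (an illustration of the idea, not the actual validator): send a HEAD request and treat any answer below 400 as alive.

        using System;
        using System.Net;

        class UrlCheck
        {
            static bool Exists(string url)
            {
                try
                {
                    var req = (HttpWebRequest)WebRequest.Create(url);
                    req.Method = "HEAD";
                    req.Timeout = 15000; // milliseconds
                    using (var resp = (HttpWebResponse)req.GetResponse())
                    {
                        return (int)resp.StatusCode < 400;
                    }
                }
                catch (WebException)
                {
                    // timed out, connection refused, or a 4xx/5xx answer
                    return false;
                }
            }

            static void Main()
            {
                Console.WriteLine(Exists("http://example.com/")); // True if reachable
            }
        }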
  • then roll back to the previous version and see if it's working again. the .exe of the previous version is in your installation folder.
  • edited March 2013
    @Ozz if i set up proxies for the PageRank check, does it also use those proxies for downloads?

    and can i disable the PageRank check?
  • Ozz
    edited March 2013
    i don't understand that question to be honest.

    disable your PR filters in your project's options. no PR filter = no PR check.
  • @Ozz i am not talking about the PR filter. the software automatically checks the PR of every submission even if i don't set any PR filter.

    i don't want it to check PR at all.
  • yep, definitely a software problem.

    if i set 20 threads, the number of failures is low. i just set it to 80 and wow, so many failed ones.

    but i have 50 Mbit fiber and i'm able to fetch results from Google with hundreds of threads via my own C# 4.5 software.

    no proxies enabled and the timeout set to 120 seconds.

    and all of these links are verified via my own C# 4.5 software, every one of them.
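
    the usual way to run that many downloads at once is to cap the number of in-flight requests rather than spawn raw threads. a minimal C# sketch of that pattern (the URL list and the cap of 80 are placeholders):

        using System;
        using System.Linq;
        using System.Net.Http;
        using System.Threading;
        using System.Threading.Tasks;

        class ThrottledFetch
        {
            static void Main()
            {
                var urls = new[] { "http://example.com/", "http://example.org/" }; // your list here
                var gate = new SemaphoreSlim(80); // at most 80 requests in flight
                var client = new HttpClient { Timeout = TimeSpan.FromSeconds(120) };

                var tasks = urls.Select(async url =>
                {
                    await gate.WaitAsync();
                    try
                    {
                        var resp = await client.GetAsync(url);
                        Console.WriteLine((int)resp.StatusCode + " " + url);
                    }
                    catch (Exception ex) // timeout or connection failure
                    {
                        Console.WriteLine("download failed - " + url + " (" + ex.Message + ")");
                    }
                    finally
                    {
                        gate.Release();
                    }
                }).ToArray();

                Task.WaitAll(tasks);
            }
        }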


  • and when i check those failed ones via Firefox, they work perfectly on the same machine, same ip :)
  • ok, the problem is most likely caused by a bug in the timeout settings

    even if i set a 600-second timeout, after 10 seconds it says download failed
    and in Firefox that failed site works perfectly :)
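
    a failure after ~10 seconds no matter how high the timeout is set usually means the connection was reset or refused rather than the timer firing. timing the request makes that visible (a sketch, not SER's code, using the URL from the log above):

        using System;
        using System.Diagnostics;
        using System.Net.Http;

        class TimeoutProbe
        {
            static void Main()
            {
                // generous timeout, so any early failure cannot be the timer
                var client = new HttpClient { Timeout = TimeSpan.FromSeconds(180) };
                var sw = Stopwatch.StartNew();
                try
                {
                    client.GetStringAsync("http://babyshower.tan-desmond.com/page/46/").Wait();
                    Console.WriteLine("ok after " + sw.Elapsed.TotalSeconds + " s");
                }
                catch (AggregateException ex)
                {
                    // well under 180 s here means reset/refused, not a timeout
                    Console.WriteLine("failed after " + sw.Elapsed.TotalSeconds + " s: "
                                      + ex.InnerException.Message);
                }
            }
        }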
  • LeeG Eating your first bourne
    You can't set a 600-second timeout. The max it will go to is 180.
  • @LeeG yeah, but it fails in 10 seconds :)
  • ok, i think the reason is Windows XP

    downloading Windows Server 2012 to try it there :)
  • @ComputerEngineer Do you have GSA Indexer running in the background? 
  • LOL LOL :D
    tried private proxies and the results are amazing. you should add a big notice to GSA that all users have to get private proxies xD I was using GSA SER only with public proxies; now imagine how many backlinks I lost...
  • @Chymmi i fixed that problem by harvesting URLs with my own software :)

    @doubleup no, i don't have that
  • yes, I also brought all my lists (verified, successful, identified) into GScraper, removed duplicate URLs, removed low PR, and now I get really fast submissions, only high PR... I am happy :D