
Is there a way to avoid downloading an extremely big page?

There are many spammy blog pages with thousands of comments on them. None of us really wants to keep commenting on pages like that, and I don't even want to post an article on such a domain!
Downloading this kind of page is a pure waste of power and time.
So, any idea how to avoid it?

Comments

  • edited June 2014
    You can set SER to skip a page if it has more than XXX outgoing links on it ;) (there is a rough sketch of that idea at the end of the thread)
  • Thanks for your reply, StoimenIliev.

    I think SER has to download that page first before it can know how many links are on it?
    But maybe that page will be skipped in the future, because SER may remember it?

    Am I right, or can't SER do that? Will it still download that page again and again before the software knows how many links are on it?
  • edited June 2014
    @lianggeren Global Options -> Filter -> Maximum size of a website to download
  • edited June 2014
    @Desire_

    I think that option only limits the file size for those 10 malware filter files, not normal submissions?

    I hope I'm wrong...
  • The option is for any website you're about to load up.
  • ronron SERLists.com
    @Desire_ and @fakenickahl are both correct. I think the default is only 2MB. That is a small size, by the way, and should be perfect for what you are talking about. (There is a rough sketch of how such a size cap works at the end of the thread.)
  • OK, I'll change that to 1MB :)
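
For illustration, here is a minimal Python sketch of the "skip if the page has more than XXX outgoing links" idea from the first reply. It is not SER's actual code: the threshold of 100 and the function names are made up for the example. It counts <a href> targets pointing away from the page's own host and flags the page once the count passes the limit.

    from html.parser import HTMLParser
    from urllib.parse import urlparse

    MAX_OUTGOING_LINKS = 100  # arbitrary example threshold, not a SER default

    class LinkCounter(HTMLParser):
        """Counts <a href> links that point to a different host."""

        def __init__(self, own_host):
            super().__init__()
            self.own_host = own_host
            self.outgoing = 0

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            href = dict(attrs).get("href") or ""
            host = urlparse(href).netloc
            # Count only links that leave the page's own host.
            if host and host != self.own_host:
                self.outgoing += 1

    def too_many_links(html, page_url, limit=MAX_OUTGOING_LINKS):
        counter = LinkCounter(urlparse(page_url).netloc)
        counter.feed(html)
        return counter.outgoing > limit

As discussed above, this check can only run after the whole page body has been fetched, so the link filter saves the submission, not the download itself.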
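
The size filter is different because it can stop a page mid-transfer. Here is a minimal sketch, again in Python and again not SER internals, of how a "maximum size of a website to download" cap can work, using the 2MB default mentioned by ronron; the URL at the bottom is just a placeholder.

    import requests  # third-party HTTP library, used here for response streaming

    MAX_BYTES = 2 * 1024 * 1024  # the 2MB default mentioned above

    def fetch_capped(url, max_bytes=MAX_BYTES):
        """Return the page body, or None if it exceeds max_bytes."""
        with requests.get(url, stream=True, timeout=30) as resp:
            # If the server declares the size up front, skip the page
            # without downloading any of the body at all.
            declared = resp.headers.get("Content-Length")
            if declared is not None and int(declared) > max_bytes:
                return None

            chunks, total = [], 0
            for chunk in resp.iter_content(chunk_size=16384):
                total += len(chunk)
                if total > max_bytes:
                    return None  # abort mid-download; the rest is never fetched
                chunks.append(chunk)
            return b"".join(chunks)

    body = fetch_capped("http://example.com/huge-spam-page")
    print("skipped" if body is None else "downloaded %d bytes" % len(body))

With a cap like this, an oversized page costs at most max_bytes of traffic no matter how many comments are on it, which is what the original question was after.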