Is there a way to avoid downloading extremely big pages?
There are a lot of spammy blog pages with thousands of comments on them. None of us really wants to keep commenting on pages like that; I don't even want to post an article on such a domain!
Downloading this kind of page just wastes bandwidth and time.
So, any idea how to avoid it?
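For anyone scripting this outside of SER, here is a minimal sketch of the general idea: check the Content-Length header with a HEAD request first, and otherwise stream the body and abort once it passes a cap. This is not SER's actual implementation, and the 2 MB limit is just an example value.

```python
import requests

MAX_BYTES = 2 * 1024 * 1024  # example cap: 2 MB


def fetch_if_small(url: str) -> bytes | None:
    # Some servers report the size up front; if so, skip without downloading.
    head = requests.head(url, allow_redirects=True, timeout=10)
    size = head.headers.get("Content-Length")
    if size is not None and int(size) > MAX_BYTES:
        return None  # declared size is too big, skip the download entirely

    # Otherwise stream the body and give up once the cap is exceeded.
    body = b""
    with requests.get(url, stream=True, timeout=10) as resp:
        for chunk in resp.iter_content(chunk_size=65536):
            body += chunk
            if len(body) > MAX_BYTES:
                return None  # page grew past the cap mid-download, abort
    return body
```

Many servers don't send Content-Length at all (chunked responses), which is why the streaming fallback is needed.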
Comments
I think SER has to download the page first before it can know how many links are on it?
But maybe that page will be skipped in future downloads, because SER may remember it?
Am I right, or can't SER do that? Will it still download that page again and again before the software knows how many links are on it?
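Whether SER actually keeps such a list is exactly what's being asked here, but as a sketch of what "remembering" could look like: record oversized URLs once, then skip them on later runs without downloading. The file name and format below are invented for illustration, not a SER file.

```python
import os

SKIP_FILE = "oversized_urls.txt"  # hypothetical cache file, one URL per line


def load_skipped() -> set[str]:
    # Load previously flagged URLs so they are never fetched again.
    if not os.path.exists(SKIP_FILE):
        return set()
    with open(SKIP_FILE, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}


def remember_skipped(url: str) -> None:
    # Append a URL that turned out to be oversized.
    with open(SKIP_FILE, "a", encoding="utf-8") as f:
        f.write(url + "\n")
```

Combined with a size-capped fetch like the one sketched above, each oversized page would only be downloaded (partially) once.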
I think that option only limits the file size for those 10 malware filter files, not for normal submissions?
I hope I'm wrong...