Here's the problem: I request
http://blog.linux.org.tw/~damon/archives/000457.html
which turns out to be 1208353 bytes of spam.

I can't be there to personally review every web page while it is
downloading and hit the browser's stop button.  A browser might not
even have been involved.

I want to say: for URLs matching *blog*, cut the page off after, say,
200000 bytes, or even earlier if the headers WWWOFFLE sees indicate
the size is that big.
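The desired logic can be sketched outside of WWWOFFLE itself. This is a minimal illustration, not a real WWWOFFLE feature: a hypothetical filter that checks the URL against a glob pattern, gives up immediately when a Content-Length header already exceeds the limit, and otherwise truncates the body at the cutoff. All names and the 200000-byte limit are taken from the post or invented for illustration.

```python
from fnmatch import fnmatch
import io

MAX_BYTES = 200_000  # cutoff suggested in the post
PATTERN = "*blog*"   # URL glob to limit

def limited_body(url, headers, stream, max_bytes=MAX_BYTES, pattern=PATTERN):
    """Return the response body, truncated for matching URLs.

    url: the requested URL
    headers: dict of response headers already received
    stream: file-like object yielding the response body
    """
    if not fnmatch(url, pattern):
        # non-matching URLs are downloaded in full
        return stream.read()
    length = headers.get("Content-Length")
    if length is not None and int(length) > max_bytes:
        # headers already say the page is too big: cut off before downloading
        return b""
    # otherwise read at most max_bytes and stop
    return stream.read(max_bytes)
```

A proxy would call this per response; a 1208353-byte page from a *blog* URL would be stopped at 200000 bytes (or at zero, if the server sent a Content-Length).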

Certainly WWWOFFLE is the right place to add such limiting
functionality. Where else on the user's end of the web chain would be
more appropriate? Somebody please add it.
