Curl has this impressive-looking feature:
$ man curl
 --max-filesize <bytes>
  Specify the maximum size (in bytes) of a file to download. If the
  file requested is larger than this value, the transfer will not
  start and curl will return with exit code 63.

  NOTE: The file size is not always known prior to download, and for
  such files this option has no effect even if the file transfer ends
  up being larger than this given limit. This concerns both FTP and
  HTTP transfers.*

Anyway, wget could also have a --max-filesize option. WWWOFFLE could do
something similar on a per-URL basis. Then we modem users wouldn't have
to worry as much about a download going hog wild.

(*Well, they ought to have another option controlling what to do when
the size is not known: download nothing (max=0), up to XXX bytes, or
without limit.

Wait... couldn't a count be kept of how many bytes have been swallowed
so far, and the transfer stopped once it exceeds the limit? Indeed, wget
prints those progress messages, so it is clearly keeping track, whether
or not the header announces a size.

And at least we can still see the top part of some whopping .JPG,
etc. Too bad .PDFs are seemingly useless if truncated.)
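The counting idea above is easy to sketch. This is a minimal, hypothetical Python illustration (not curl's or wget's actual implementation): count bytes as each chunk arrives and stop at the limit, which works even when the server sends no Content-Length header. The function name `fetch_capped` and the 8 KB chunk size are my own choices for the example.

```python
import urllib.request

def fetch_capped(url, max_bytes):
    """Download at most max_bytes from url.

    Returns (data, truncated). Bytes are counted as they arrive, so the
    cap holds even when the size is not known in advance.
    """
    data = bytearray()
    truncated = False
    with urllib.request.urlopen(url) as resp:
        # If the server does advertise a size, refuse up front,
        # like curl's --max-filesize does.
        length = resp.headers.get("Content-Length")
        if length is not None and int(length) > max_bytes:
            raise ValueError("advertised size exceeds limit")
        while True:
            chunk = resp.read(8192)
            if not chunk:
                break
            if len(data) + len(chunk) > max_bytes:
                # Keep the top part (fine for a .JPG, useless for a .PDF)
                # and stop reading; closing the connection aborts the rest.
                data.extend(chunk[: max_bytes - len(data)])
                truncated = True
                break
            data.extend(chunk)
    return bytes(data), truncated
```

A truncated result still gives you the top of the file, which is exactly the partial-image behavior noted above.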
