Offline, I saw a link that said 'see if your site is blacklisted in
China', http://asp-cyber.law.harvard.edu/filtering/list.html, so I
clicked it to be fetched the next day.

What a wastefully formatted, repetitive 2.4 MB file it turned out to
be, which cost me several coins of modem time before I realized what
was slowing down all my other small downloads.

I wish wwwoffle had an "approval needed" mechanism: when an incoming
Content-Length exceeds a user-set limit, the URL would be moved to a
special /var/cache/wwwoffle/confirm directory, where we could see why
each item was held up and click to approve it. Maybe it could even
optionally fetch the first bit of the file, so we know what we're in
for.
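The held/approve idea above could be prototyped outside wwwoffle. Here is a minimal sketch: SIZE_LIMIT, content_length, and check_url are all hypothetical names of my own, not anything wwwoffle provides, and the parsing assumes wget's "Length:" header line.

```shell
#!/bin/sh
# Hypothetical pre-flight size check -- not part of wwwoffle itself.
SIZE_LIMIT=500000   # bytes; the user-set threshold

# Extract the byte count from wget --spider output on stdin.
# Handles both "Length: 2400000" and older "Length: 2,400,000" forms.
content_length() {
    awk '/^Length:/ { gsub(/,/, "", $2); print $2; exit }'
}

# Return 0 (approve) if the URL's reported size is under the limit,
# 1 (hold) otherwise.  Unknown or missing sizes are held for review.
check_url() {
    url=$1
    len=$(wget --spider -Y off -t 1 "$url" 2>&1 | content_length)
    case $len in
        ''|*[!0-9]*) return 1 ;;               # no usable length: hold it
        *)           [ "$len" -le "$SIZE_LIMIT" ] ;;
    esac
}
```

A real implementation would move the held request file into the confirm directory instead of just returning a status.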

I mean, say you see http://nurd.org/tinyfile.txt and click it for
downloading next Monday.  What are the chances you'll catch that it is
actually a 15 billion byte file before spending more modem time than
you planned?

Must we, upon connecting, do a
wwwoffle-ls outgoing | awk '{print $NF}' |
xargs wget --spider -Y off -t 1 2>&1 |
awk '/^(--|Length|..:)/' #[can refine]
Hmm, wait: though it wastefully connects twice, maybe that's a neat
personal early warning system...
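One possible refinement of that one-liner: pair each queued URL with its reported length and print only the oversized ones. The flag_big name and the 500000-byte limit are my own inventions, and this assumes, as the pipeline above does, that the URL is the last field of wget's "--time--" line.

```shell
# Hypothetical filter over wget --spider output: print "length URL"
# for anything over the limit, silently pass the rest.
flag_big() {
    awk -v limit=500000 '
        /^--/      { url = $NF }                 # "--HH:MM:SS--  URL" line
        /^Length:/ { len = $2; gsub(/,/, "", len)
                     if (len + 0 > limit) print len, url }
    '
}

# Usage sketch (connects once, before approving the queue):
# wwwoffle-ls outgoing | awk '{print $NF}' | \
#     xargs -n1 wget --spider -Y off -t 1 2>&1 | flag_big
```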

J> http://ziproxy.sourceforge.net/

Another layer of complexity, just when I'm running out of cells upstairs.
