Way back when I was running whendoze (around 10 years ago), I had a
shareware program called File Hound. One of the many features I liked
about it was that if I had it running while browsing the internet and
I right-clicked on something and chose "copy link location", File
Hound would take it from there and fetch the object of that link in
the background while I continued surfing. It even queued subsequent
links and got to them as soon as it could.
I've been trying to think of how I might do something like that in
Linux. wget is just as tenacious as File Hound about downloads that
keep breaking, and I think -nc (--no-clobber) is the way to tell it
not to bother fetching duplicates. What I'm not sure about is how to
get wget to run like a daemon and listen for copied links. Any ideas?
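The closest thing I can picture is a little script that polls the X
clipboard and hands anything that looks like a URL off to wget.
Something like this rough Python sketch, assuming xclip is installed
(the wget flags and the one-second poll interval are just guesses on
my part):

#!/usr/bin/env python
# Rough sketch: poll the X clipboard with xclip and hand new URLs to wget.
# Assumes xclip is installed; flags and poll interval are my guesses.

import subprocess
import time

seen = set()   # URLs already handed to wget
last = None    # last clipboard contents, so we only react to changes

while True:
    try:
        # "Copy link location" in the browser lands in the CLIPBOARD selection.
        clip = subprocess.check_output(
            ["xclip", "-o", "-selection", "clipboard"]).decode().strip()
    except subprocess.CalledProcessError:
        clip = ""   # empty clipboard or xclip complained; try again next round

    if clip and clip != last:
        last = clip
        if clip.startswith(("http://", "https://", "ftp://")) and clip not in seen:
            seen.add(clip)
            # -b sends wget to the background so we keep polling;
            # -nc (no-clobber) skips files that are already on disk.
            subprocess.call(["wget", "-b", "-nc", clip])

    time.sleep(1)

Not a real daemon, obviously, just a polling loop, but it might be
close enough to what File Hound was doing.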
--
Ice, ice, everywhere, as far as the eye can see,
And nowhere can be found a drop of gin for me.
--Stewart Stremler