On Tue, Aug 12, 2008 at 16:57, Ralph Shumaker <[EMAIL PROTECTED]> wrote:
> Way back when I was running whendoze (around 10 years ago), I had a
> shareware program called File Hound. One of the many features that I liked
> about that program is that if I had it running while browsing the internet,
> and I right-clicked on something and chose "copy link location", File Hound
> would take it from there and fetch the object of that link, in the
> background, while I continued surfing. It even queued subsequent links and
> got to them as soon as it could.
>
> I've been trying to think of how I might be able to do something like that
> in linux. wget is just as tenacious as File Hound for files whose download
> keeps breaking. I think there's a way to tell wget not to bother fetching
> duplicates. What I'm not sure about is how to get wget to run like a
> daemon, and listen for copied links. Any ideas?

Sure seems like a lot of overthinking is going on in this thread. KDE has
kget in the kdenetwork package, and there is gwget for GNOME at
http://www.gnome.org/projects/gwget/. Both handle scenarios like this:
nice little GUI apps that run in your notification area.

--
[email protected]
http://www.kernel-panic.org/cgi-bin/mailman/listinfo/kplug-list
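If you do want to roll your own wget-as-daemon instead, here's a rough sketch of the idea: poll the X clipboard with xclip, skip anything that isn't a link or has been seen before, and hand the rest to wget. This assumes xclip and wget are installed; the file paths and one-second poll interval are just illustrative choices.

```shell
#!/bin/sh
# Watch the clipboard and download any new link that appears.
# Assumes xclip and wget are available; paths below are examples.
SEEN=/tmp/seen-urls.txt          # remember what we've already fetched
DEST="$HOME/Downloads"           # where downloads land
: > "$SEEN"
mkdir -p "$DEST"

while sleep 1; do
    url=$(xclip -o -selection clipboard 2>/dev/null)
    # Only act on things that look like links.
    case "$url" in
        http://*|https://*|ftp://*) ;;
        *) continue ;;
    esac
    # Skip duplicates, like File Hound did.
    grep -qxF "$url" "$SEEN" && continue
    echo "$url" >> "$SEEN"
    # -c resumes broken downloads; -b backgrounds wget so the
    # loop keeps listening while the fetch runs.
    wget -c -b -P "$DEST" "$url"
done
```

The -c flag gives you the same tenacity on broken downloads, and the seen-list handles the "don't fetch duplicates" part; kget/gwget just do all of this for you with a GUI.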
