Gus Wirth wrote:
Ralph Shumaker wrote:
About 10 years ago, when I was on w95, I had a program called FileHound. It was the best downloader program I have ever used. It had many switches you could configure to suit things more to your liking. I set it to be *extremely* aggressive with the bandwidth whenever the program was not minimized, but to be very meager with the bandwidth when minimized. If memory serves me, I could just point it at a file on ftp://someSite.com/pathTo/file or http://someSite.com/pathTo/file and it would figure out the rest. It wouldn't bother me beyond that. And it was very aggressive. If it failed one way, it would try another. If a download broke, it would automatically restart it (or in most cases, actually *resume* it).

I have never since seen a program that convenient *OR* that aggressive. After 10 years, I would have thought such things would become more prevalent.

Why do the best things seem to devolve, or go extinct, or just not catch on?

I miss that FileHound program. I wish the author would port it to Linux.

Anyone know of anything even close to it on Linux?

wget

wget is a command-line program that does most of what you have described, except for the automatic throttling part; that can be handled with nice. Use it like so:

wget http://someSite.com/pathTo/file
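
If you want it to be as aggressive about broken downloads as FileHound was, wget has switches for that too. A sketch (all of these are standard wget options; someSite.com is the same placeholder as above):

    # resume partial files (-c), retry forever (-t 0), keep retrying a
    # server that refuses connections, back off up to 10s between tries,
    # and give up on a stalled connection after 30s so the next try starts
    wget -c -t 0 --retry-connrefused --waitretry=10 -T 30 \
         http://someSite.com/pathTo/file

The -c is the important one: it resumes a partial download instead of restarting it from scratch.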

So, are you saying that nice doesn't just make things be nicer about CPU usage, but modem usage as well? That would be great. It would be nice if downloads would soak the connection _*except*_ when *anything else* tried to use the modem.
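
(Poking at the man page, wget itself has a --limit-rate switch, though it looks like a fixed cap rather than the automatic back-off FileHound did. A sketch, with a made-up 20 KB/s cap:

    # nice -n 19 lowers only the CPU priority of the wget process;
    # --limit-rate is what actually caps the bandwidth
    nice -n 19 wget --limit-rate=20k http://someSite.com/pathTo/file

As far as I can tell, nice governs CPU scheduling only, so any bandwidth cap has to come from wget itself.)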



See the man page for further info. wget can be used to grab entire web sites if desired.
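
For example, a sketch of grabbing everything under a directory (someSite.com is just a placeholder; the flags are standard wget options):

    # -r recurses through links, -np refuses to climb above pathTo/,
    # -k rewrites links in the saved pages so they work locally,
    # -p also fetches the images and stylesheets each page needs
    wget -r -np -k -p http://someSite.com/pathTo/

See the recursive retrieval options in the man page for the rest.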

Could I get yum to use wget instead of whatever it currently uses? yum gives up too easily. Whenever I got desperate and yum was choking on rpmforge's header file download, I would use Konqueror to download the file (resuming the download after it died every 12%), chown it to root, and mv it to the proper place. Then yum was happy. But doing it that way is a PITA. (Konqueror would resume the download of the header file; Firefox always restarted it. gFTP is too much of a PITA to use. I don't recall trying anything else.) I suppose I could have started Konqueror as root and downloaded the header file directly to its proper place, but even that would have been more of a PITA than it should be. yum should try to resume downloading header files. Through Konqueror, I had to resume 9 times (once every 12%), but it got it.
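
In the meantime, I suppose something like this would beat babysitting Konqueror. A sketch, assuming the cache lives where I think it does on my box (the path and the header URL here are guesses, not gospel):

    # run as root; drop a resumable download straight into yum's cache
    cd /var/cache/yum/rpmforge/headers
    wget -c -t 0 http://someSite.com/pathTo/header.info

And yum.conf does have a retries setting under [main] that might help with the giving-up part, though I don't believe it resumes a partial file:

    [main]
    retries=20
    timeout=60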


--
Ralph

--------------------
Introducing ambiguity is bad.
--Stewart Stremler

Give me ambiguity, or give me something else!
--kelsey hudson
