Christian Fraenkel wrote:

> Running wget on the local host as a daemon doesn't make much sense,
> I agree. I had bigger networks in mind when writing this. Instead of running
> multiple wgets on different machines, or even as different users on the
> same machine, I would like to give my users the possibility to just append
> their downloads to the central download queue of the sole wget daemon
> (via a client that accepts wget syntax / a GUI interface). This way the server
> could do central bandwidth throttling (if that ever gets implemented in a
> portable way). The download would also be faster for the individual user
> (given that wget queues in an intelligent way, e.g. gives smaller files a
> priority boost).
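The "smaller files first" part is basically a priority queue keyed on file
size. A rough sketch (Python, nothing wget-specific; sizing the file via a
HEAD request up front is just one way to do it, and unknown sizes go last):

import heapq
import itertools
import subprocess
import urllib.request

queue = []               # (size, seq, url) entries; smallest size comes out first
seq = itertools.count()  # tie-breaker keeps equal sizes in FIFO order

def enqueue(url):
    # Ask the server for the size; files of unknown size sort to the back.
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            size = int(resp.headers.get("Content-Length") or 1 << 60)
    except OSError:
        size = 1 << 60
    heapq.heappush(queue, (size, next(seq), url))

def run_next():
    # One download at a time, smallest pending file first.
    size, _, url = heapq.heappop(queue)
    subprocess.run(["wget", url])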

I had once implemented a "queued wget", but not as a daemon. I used two methods
(suitable for heterogeneous M$ (actually any OS) / U*X networks):
1. You could send a URL (and options) to a virtual user "download" on my machine.
2. A shared drive where you could drop links (e.g. from Netscape's "Location" field).

In both cases, batch jobs did the real work and mailed back when finished; only
one wget ran at a time. With the mail, you could also include a priority ("now"
or "during the night"), which was especially handy with our quite low bandwidth
to the www.

I think I've still got the scripts, if you're interested.

regards
Hardy

------------------
Linux is like a wigwam:
No Windows, no Gates
and an Apache inside.
