[EMAIL PROTECTED] wrote:
> I'm seeking some advice on implementing a poor man's distributed
> processing system.
> The goal is to use two Debian boxes connected by a crossover cable
> (PPP or eth), where one box does the video capturing, some processing,
> and saving to disk (master, heavy load). The other box would download
> the data and start crunching (slave, light load).
> Now, the capturing box is both CPU- and I/O-stressed, so I'd like to
> shut down most of the daemons etc.
Note that networking daemons shouldn't consume any CPU time or
physical memory unless they are actually being used. While they are
idle (i.e. waiting for a connection), they won't be allocated any CPU
time, and their address space can be swapped out if the memory is
needed elsewhere.
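For example, you can verify this with ps (assuming the daemon you're
curious about is sshd; substitute whichever one you run):

    ps -o pid,rss,pcpu,args -C sshd

An idle daemon will typically show a very small resident size and
0.0 %CPU.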
> Exactly what network services are needed in order for the second box
> to be able to connect?
That depends upon how you want to retrieve the data (FTP, HTTP, rcp,
rsh, ssh, NFS, SMB, ...).
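For example, if you settle on ssh/scp, the only service the master
needs to run is sshd; the slave can then pull files with something
like (the paths are just placeholders):

    scp master:/var/capture/frame-0001.raw /scratch/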
> Second, is there some programming framework which would make it easy
> for the slave box to download files? (programming libs etc.)
> Or would it be easier to revert to socket programming in C? (I've done
> that in the past.)
The simplest approach would be to export the files via NFS. That way
you don't need to write any networking code. However, this isn't
necessarily the most efficient approach. It may be more efficient to
simply copy the data (e.g. with rcp).
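A rough sketch of the NFS route (directory names are only examples):
on the master, export the capture directory read-only via /etc/exports,

    /var/capture    slave(ro)

run exportfs -a (or restart the NFS server), then on the slave mount it:

    mount master:/var/capture /mnt/capture

The rcp equivalent is just:

    rcp master:/var/capture/frame-0001.raw /scratch/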
--
Glynn Clements <[EMAIL PROTECTED]>