Jeremy,
>Greetings...
>
>I do something similar, pulling across 400+ log files, 3GB+ total size
>every night. However, I am using considerably fewer drones than you. We found
>that we reached a saturation point long before that. Disk I/O and network
>I/O both became bottlenecks before we re
> -Original Message-
> Sent: 2001 9:30 AM
> To: [EMAIL PROTECTED]
> Subject: RE: [Perl-unix-users] memory usage of perl processes
> -Original Message-
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED]]On Behalf Of
> [EMAIL PROTECTED]
> Sent: Wednesday 21 March 2001 16:30
> To: [EMAIL PROTECTED]
> Subject: RE: [Perl-unix-users] memory usage of perl processes
>
>I guess you could use 'top' in a unix window.
>
>Then kick off 1 drone only and look at the memory usage. Then * it by the
>number of processes you expect.
>55 is a lot of processes on any unix system.
>
>Perhaps you could stagger the number of processes, so maybe spawn 30
>drones and have
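
(To put numbers on the 'top' suggestion above — a rough sketch; it measures
the current shell via $$ as a stand-in for a single drone PID, and the 55 is
just the process count mentioned earlier:)

```shell
# Resident set size (KB) of one process; multiply by the expected
# drone count for a ballpark total. ps -o rss= works on most unixes.
rss_kb=$(ps -o rss= -p $$)
echo "one process: ${rss_kb} KB"
echo "estimate for 55 drones: $((rss_kb * 55)) KB"
```

Note this overestimates: drones forked from one parent share copy-on-write
pages, so real usage is usually well under the straight multiply.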
> -Original Message-
> Sent: Wednesday 21 March 2001 16:00
> To: [EMAIL PROTECTED]
> Subject: [Perl-unix-users] memory usage of perl processes
>
> I'm sure this comes up frequently, so I apologize...
>
> I am designing a system to process almost 4000 remote sites in a nightly
> sweep. This process is controlled from a database which maintains site
> status in realtime (or at least that's the goal). I am attempting to fork
> off around 100 "drones"
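
(For what it's worth, a minimal sketch of the staggered approach suggested
above: fork up to a fixed cap and reap one child before starting the next.
The cap of 30, the site list, and process_site() are placeholders, not part
of the original setup:)

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $max_drones = 30;          # placeholder cap, per the "spawn 30 drones" idea
my @sites      = (1 .. 100);  # stand-in for the real site list
my $running    = 0;

for my $site (@sites) {
    # Once the cap is hit, block until one child exits before forking another.
    if ($running >= $max_drones) {
        wait();
        $running--;
    }
    defined(my $pid = fork()) or die "fork failed: $!";
    if ($pid == 0) {
        process_site($site);   # hypothetical per-site work
        exit 0;
    }
    $running++;
}
wait() while $running-- > 0;   # reap any remaining drones

print "processed ", scalar(@sites), " sites\n";

sub process_site { my ($s) = @_; return }
```

This keeps at most $max_drones children alive at once, which also bounds the
memory estimate from the 'top' advice earlier in the thread.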