On 12/2/05, Joseph Mack NA3T <[EMAIL PROTECTED]> wrote:
> Occasionally I copy several 100G of files to another
> machine for backup. This doesn't put any particular load on
> either machine but other processes crawl, presumably because
> memory is being flushed. Is this the likely explanation? If
> so is there a way to restrict the amount of memory that the
> copy uses?

Possibly, depending on what you are using to do the backup.

You might be able to use the bash built-in command ulimit, which can
limit the use of various resources, including memory. But unless the
command/program you're running handles memory limits gracefully,
the limit is more likely to make it fail outright than to make it
behave better.
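As a minimal sketch (assuming bash, and a 1 GB cap chosen purely for illustration), you can apply the limit inside a subshell so it affects only the backup command and not your login shell:

```shell
# Run in a subshell: the ulimit applies to the subshell and anything
# it spawns, and vanishes when the subshell exits.
(
    ulimit -v 1048576   # cap virtual memory at ~1 GB (value is in KB)
    ulimit -v           # show the limit now in effect
    # your copy command would go here, e.g. rsync or tar over ssh
)
```

Note that `ulimit -v` limits a process's virtual address space, not the kernel's page cache, so if the slowdown is really the page cache being churned by the copy, this may not help much.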


--
Rick DeNatale

Visit the Project Mercury Wiki Site
http://www.mercuryspacecraft.com/
--
TriLUG mailing list        : http://www.trilug.org/mailman/listinfo/trilug
TriLUG Organizational FAQ  : http://trilug.org/faq/
TriLUG Member Services FAQ : http://members.trilug.org/services_faq/