On 07/21/2009 04:22 PM, John Doe wrote:
> From: Sean Carolan <scaro...@gmail.com>
>>> While having hard limits makes it safer, wouldn't it be better to control
>>> the memory usage of the script instead of setting limits that would
>>> trigger an "out of memory"...?
>> How would you control the memory usage of the script if it's run by
>> the root user?
> 
> By control I meant to design the script to use a specific amount of RAM, 
> instead of letting it vampirise all available memory...

But what if the program's memory use depends on lots of factors that are not
easily predictable? And you want to avoid bringing the whole system to its
knees with swapping, or having arbitrary other programs killed, while one
program consumes all of RAM and swap.
In that case it's easier to limit that program's memory to e.g. 1 GByte of RAM,
within which normal input can usually be processed without any trouble. Then,
when someone feeds the program bad data that uses exponentially more memory,
it stops gracefully, giving a clear error message that this input results in
too much memory use.
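
For what it's worth, here's a minimal sketch of that pattern, assuming the
script is Python (the thread never says what language it's in; the same idea
works with ulimit -v in a shell wrapper or setrlimit() in C). The 1 GB figure
and the process_input() helper are just placeholders:

    # Sketch: cap this process's memory and fail gracefully when the cap is hit.
    import resource
    import sys

    ONE_GB = 1024 * 1024 * 1024

    # Cap the process's address space; this applies even when run as root,
    # because the limit is set by the process on itself (and its children).
    resource.setrlimit(resource.RLIMIT_AS, (ONE_GB, ONE_GB))

    def process_input(path):
        # Placeholder for the real work; slurping a whole file is exactly
        # the kind of step that can blow up on unexpectedly large input.
        with open(path) as f:
            return f.read().splitlines()

    try:
        lines = process_input(sys.argv[1])
    except MemoryError:
        # The limit was hit: stop with a clear message instead of dragging
        # the whole box into swap or triggering the OOM killer.
        sys.stderr.write("error: this input needs more than 1 GB of RAM, aborting\n")
        sys.exit(1)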

Lots of scenarios exist where such a limit is a valid approach.