On Jul 2, 8:03 pm, William Stein <[email protected]> wrote:
> On Sat, Jul 2, 2011 at 7:46 AM, Jason Grout <[email protected]> 
> wrote:
> > I've been doing some experiments with the python resource module and the
> > setrlimit command to set resource limits on processes.  I've found something
> > very interesting: it seems that on OSX (10.6.8), there is no way to have a
> > hard cap on the amount of memory a process uses, at least no way that I can
> > find.
>
> Wow, this is really disturbing.
> Thanks for letting us know about this.
> I tried using "ulimit -v" from bash in OS X, and it also doesn't seem
> to do anything at all.
> Scary.
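
For anyone who wants to reproduce this, here is a small sketch (mine, not Jason's exact experiment) that forks a child, applies RLIMIT_AS via the resource module, and checks whether a deliberately oversized allocation gets refused. POSIX-only; the sizes are arbitrary:

```python
import os
import resource

def probe_rlimit_as(limit_bytes, alloc_bytes):
    """Fork a child that caps its address space with RLIMIT_AS and then
    tries one allocation larger than the cap.  Return True if the cap
    was enforced (the allocation raised MemoryError), False otherwise."""
    pid = os.fork()
    if pid == 0:
        # Child: apply the cap, then deliberately try to exceed it.
        resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, limit_bytes))
        try:
            buf = bytearray(alloc_bytes)
            del buf
            os._exit(0)   # allocation succeeded -> cap NOT enforced
        except MemoryError:
            os._exit(1)   # allocation refused   -> cap enforced
    _, status = os.waitpid(pid, 0)
    return os.WIFEXITED(status) and os.WEXITSTATUS(status) == 1

if __name__ == "__main__":
    # Cap at 1 GiB, then try to allocate 2 GiB.
    # On Linux this should print True; per this thread, OS X does not
    # enforce RLIMIT_AS, so there it would print False.
    print(probe_rlimit_as(1024**3, 2 * 1024**3))
```

Doing the setrlimit call in a forked child keeps the cap from clobbering the parent process, which is also roughly how a notebook server would apply per-worker limits.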

Indeed.

See also

http://stackoverflow.com/questions/3274385/how-to-limit-memory-of-a-os-x-program-ulimit-v-neither-m-are-working

and

http://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/man5/launchd.plist.5.html

('Data' key in 'HardResourceLimits').

As I have no Mac OS X boxes, I can't tell whether this works.
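
For reference, a launchd.plist fragment using those keys might look like the following. The `HardResourceLimits` dictionary and its `Data` key are from the man page above; the label, program path, and byte value are made-up placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>org.example.memcapped</string>      <!-- hypothetical label -->
    <key>ProgramArguments</key>
    <array>
        <string>/usr/bin/python</string>
        <string>/path/to/server.py</string>     <!-- hypothetical script -->
    </array>
    <key>HardResourceLimits</key>
    <dict>
        <key>Data</key>
        <integer>1073741824</integer>           <!-- 1 GiB data-segment cap -->
    </dict>
</dict>
</plist>
```

Again, untested here; whether the kernel actually honors the Data limit on 10.6 is exactly the open question.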


-leif



> > The RLIMIT_AS category for resource limits on OS X is an alias for
> > RLIMIT_RSS (see /usr/include/sys/resource.h, and note that RLIMIT_AS
> > does not appear in the setrlimit manpage).  However, on Linux (at least
> > the Linux running on boxen), RLIMIT_AS *does* put a hard cap on the
> > amount of memory a process can use, and the process fails if it tries
> > to use more memory than that.
>
> > So, the takeaway: be careful when running public sage notebook servers on
> > OSX, since you can't specify a maximum memory hard cap for a process.
>
> > On the other hand, maybe I'm misunderstanding something.  Does anyone know
> > how to set a hard memory cap on OSX?
>
> > Thanks,
>
> > Jason
