James Mansion wrote:
> I can't see how an OS can lie to processes about memory being allocated to
> them and not be ridiculed as a toy, but there you go. I don't think Linux is
> the only perpetrator - doesn't AIX do this too?

This is a leftover from the days of massive physical-modeling programs (chemistry, 
physics, astronomy, ...) written in FORTRAN.  Since FORTRAN didn't have pointers, 
scientists would allocate enormous three-dimensional arrays, and their code might 
access only a tiny fraction of that memory.  The operating-system vendors, 
particularly SGI, added features to the various flavors of UNIX, including the 
ability to overcommit memory, to support these FORTRAN programs, which at the time 
were among the most important applications driving both scientific computing and 
the architecture of workstation-class computers.
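
That pattern is easy to reproduce in a few lines of C.  This is just a minimal 
sketch (it assumes a 64-bit Linux box, and is not taken from any of those old 
programs): the 8 GiB request succeeds whether or not the machine can ever back 
it, and only the slice the program actually touches consumes physical memory. 
With overcommit disabled, a machine without 8 GiB of headroom refuses the 
request up front instead of killing something later.

    #include <stdio.h>
    #include <stdlib.h>

    #define N 1024   /* a 1024 x 1024 x 1024 grid of doubles = 8 GiB */

    int main(void)
    {
        size_t bytes = (size_t) N * N * N * sizeof(double);
        double *grid = malloc(bytes);

        if (grid == NULL) {
            /* With strict accounting (vm.overcommit_memory = 2) on a small
             * machine, the request is refused here, up front. */
            puts("allocation refused");
            return 1;
        }

        /* Touch only one thin slice of the grid, the way the old FORTRAN
         * codes often did.  Only these touched pages are ever backed by
         * physical memory; the rest of the 8 GiB is just a promise. */
        for (size_t i = 0; i < (size_t) N * N; i++)
            grid[i] = 1.0;

        printf("allocated %zu bytes, touched %zu\n",
               bytes, (size_t) N * N * sizeof(double));
        free(grid);
        return 0;
    }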

When these workstation-class computers evolved enough to rival mainframes, 
companies started shifting applications like Oracle onto the cheaper hardware. 
Unfortunately, the legacy of those FORTRAN programs is still with us, and every 
few years we have to go through this discussion again.

Disable memory overcommit.  There is NO REASON to use it on a modern 
server-class computer, and MANY REASONS WHY IT IS A BAD IDEA.
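
On Linux the setting lives in the vm.overcommit_* sysctls.  A typical 
strict-accounting setup looks something like the lines below (the ratio of 80 
is only an illustration -- pick a value that fits your RAM and swap, and see 
the kernel documentation and the PostgreSQL manual for the details):

    # /etc/sysctl.conf -- disable overcommit, use strict accounting
    vm.overcommit_memory = 2
    vm.overcommit_ratio = 80      # commit limit = swap + 80% of RAM

    # apply without a reboot
    sysctl -p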

Craig

--
Sent via pgsql-performance mailing list (pgsql-performance@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-performance
