On Thu, Sep 17, 2009 at 1:31 PM, Bill Moran <wmo...@potentialtech.com> wrote:
> In response to Scott Marlowe <scott.marl...@gmail.com>:
>
>> On Thu, Sep 17, 2009 at 12:56 PM, Alan McKay <alan.mc...@gmail.com> wrote:
>> > Is there any way to limit a query to a certain amount of RAM and / or
>> > certain runtime?
>> >
>> > i.e. automatically kill it if it exceeds either boundary?
>> >
>> > We've finally narrowed down our system crashes and have a smoking gun,
>> > but no way to fix it in the immediate term.  This sort of limit would
>> > really help us.
>>
>> Generally speaking, work_mem limits the RAM used.  What are your
>> non-default postgresql.conf settings?
>
> work_mem limits memory usage _per sort (or hash) operation_, not per query.
>
> A big query can easily contain many such operations.  Each one is limited
> to work_mem, but the total for the query can be much higher.
>
> The only way I can think is to set a per-process limit in the OS and allow
> the OS to kill a process when it gets out of hand.  Not ideal, though.
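The OS-level limit Bill describes could look like this — a sketch only, with hypothetical sizes and paths; ulimit settings are inherited by child processes, so a cap set in the startup script applies to every backend:

```shell
# Sketch (not from the thread): cap each backend's virtual address space so
# a runaway query dies with an out-of-memory error instead of taking the
# whole box down.  The 4 GB figure and the data directory are examples.
ulimit -v 4194304                        # ~4 GB cap, in kB, inherited by children
echo "limit now: $(ulimit -v) kB"        # confirm the cap took effect
# pg_ctl -D /var/lib/pgsql/data start    # postgres started here inherits the cap
```

As Bill notes, this is a blunt instrument: the kernel enforces it per process, with no regard for which query caused the growth.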

True, but with a work_mem of 2MB, I can't imagine a single query running
enough sorts to need 4GB of RAM (that would be ~2000 concurrent sorts).
I'm betting the OP was looking at top and misreading the numbers, which is
pretty common really -- top counts PostgreSQL's shared buffers in each
backend's resident size, so every backend looks far bigger than it is.
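For the runtime half of the OP's question, PostgreSQL does have a built-in knob: statement_timeout aborts any statement that runs longer than the given duration. A sketch of both limits (the role name and values are illustrative, not from the thread):

```sql
-- Kill any statement in this session that runs longer than 60 seconds.
SET statement_timeout = '60s';

-- Cap memory per sort/hash operation (not per query -- see above).
SET work_mem = '2MB';

-- Or make the timeout stick for a given role instead of per session:
-- ALTER ROLE app_user SET statement_timeout = '60s';
```

statement_timeout only bounds runtime, not memory, so it complements rather than replaces an OS-level cap.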

-- 
Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general
