On Fri, Feb 04, 2005 at 05:59:26 -0800,
  Stephan Szabo <[EMAIL PROTECTED]> wrote:
> On Fri, 4 Feb 2005, Eric Jain wrote:
> 
> > I'm trying to fill a table with several million rows that are obtained
> > directly from a complex query.
> >
> > For whatever reason, Postgres at one point starts using several
> > gigabytes of memory, which eventually slows down the system until it no
> > longer responds.
>
> > Any ideas? Is this a known problem, or should Postgres be able to handle
> > this? May be tricky to reproduce the problem, as a lot of data is
> > required, but I can post the DDL/DML statements I am using if this helps.
> 
> Explain output would also be useful.  I would wonder if it's a problem
> with a hash that misestimated the necessary size; you might see if
> analyzing the tables involved changes its behavior.

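For what it's worth, re-analyzing and then looking at the plan would go
roughly like this (the table names below are made up; the real statement
would be Eric's complex INSERT ... SELECT):

    ANALYZE src;
    ANALYZE lookup;
    EXPLAIN
    INSERT INTO target
    SELECT s.id, l.val
    FROM src s JOIN lookup l ON l.id = s.id;

If the estimated row count on the Hash node looks nothing like the real
size of the table being hashed, that would fit the misestimation Stephan
is describing.
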
I think deferred triggers can also use a lot of memory.
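
In particular, if I remember right, when the target table has foreign keys
that are deferrable and deferred for the transaction (INITIALLY DEFERRED,
or via SET CONSTRAINTS ALL DEFERRED), every inserted row queues a
constraint-trigger event that only fires at COMMIT, and that queue is kept
in memory for the whole transaction. A rough sketch with made-up names:

    CREATE TABLE parent (id integer PRIMARY KEY);
    CREATE TABLE child (
        parent_id integer REFERENCES parent (id)
            DEFERRABLE INITIALLY DEFERRED
    );
    INSERT INTO parent SELECT g FROM generate_series(1, 1000000) g;
    BEGIN;
    -- each of these rows queues a deferred FK check until COMMIT
    INSERT INTO child SELECT g FROM generate_series(1, 1000000) g;
    COMMIT;

Leaving the constraints immediate, or dropping them for the bulk load and
re-adding them afterwards, keeps that queue from growing with the row count.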
