"Tomeh, Husam" <[EMAIL PROTECTED]> writes:
> When I first connect to the database via "psql" and issue the "create
> index" statement, of course, I get the "out of memory" error. If I
> don't quit my current session and re-ran the same DDL statement again,
> the index gets created successfully!
Thought it may be an interesting observation to think about.
Husam Tomeh
-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED]] On Behalf Of Tomeh, Husam
Sent: Thursday, February 23, 2006 11:57 AM
To: Tom Lane
Cc: pgsql-performance@postgresql.org
Subject: Re: [PERFORM] 0ut of Memory Error during Vacuum Analyze and
Create Index
"Tomeh, Husam" <[EMAIL PROTECTED]> writes:
> mtrac=# show maintenance_work_mem ;
>  maintenance_work_mem
> ----------------------
>  1048576              <==
> (1 row)
> mtrac=#
> mtrac=#
> mtrac=# create index mort_ht on mortgage(county_id,mtg_rec_dt);
> ERROR: out of memory
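A quick arithmetic check (assuming 8.1's convention that maintenance_work_mem is a raw kilobyte count) suggests the failed allocation reported elsewhere in the thread is essentially this entire setting requested in one chunk:

```python
# The session above shows maintenance_work_mem = 1048576, which
# PostgreSQL 8.1 interprets as kilobytes.
setting_kb = 1048576
setting_bytes = setting_kb * 1024
print(setting_bytes)  # 1073741824, i.e. 1 GiB

# The error DETAIL in the thread reports the size of the failed request:
failed_request_bytes = 1073741820
print(setting_bytes - failed_request_bytes)  # 4 -- just shy of the full 1 GiB
```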
-----Original Message-----
From: Tom Lane [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, February 14, 2006 2:16 PM
To: Tomeh, Husam
Cc: pgsql-performance@postgresql.org
Subject: Re: [PERFORM] 0ut of Memory Error during Vacuum Analyze and
Create Index
"Tomeh, Husam" <[EMAIL PROTECTED]> writes:
> I have run pg_dump and had no errors. I also got this error when
> creating one index but not another. When I lowered my
> maintenance_work_mem, the create index succeeded.
Create index too? Hm. That begins to sound more like a memory leak.
Do you ha...
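A side note on units: in 8.1, maintenance_work_mem is set as a raw kilobyte count (unit suffixes like '64MB' only appeared in later releases), so "lowering" the setting means converting by hand. A minimal sketch, with an illustrative helper name:

```python
def mb_to_kb(megabytes):
    # PostgreSQL 8.1 expects memory settings such as maintenance_work_mem
    # as plain kilobyte counts, e.g.  SET maintenance_work_mem = 65536;
    return megabytes * 1024

print(mb_to_kb(64))    # 65536 -- a 64 MB setting
print(mb_to_kb(1024))  # 1048576 -- the 1 GB value shown in the failing session
```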
-----Original Message-----
From: Tom Lane [mailto:[EMAIL PROTECTED]]
Sent: ..., 2006 9:51 AM
To: Tomeh, Husam
Cc: pgsql-performance@postgresql.org
Subject: Re: [PERFORM] 0ut of Memory Error during Vacuum Analyze
"Tomeh, Husam" <[EMAIL PROTECTED]> writes:
> =# vacuum analyze code;
> ERROR: out of memory
> DETAIL: Failed on request of size 1073741820.
That looks a whole lot like a corrupt-data issue. The apparent
dependency on maintenance_work_mem is probably illusory --- I suspect
some of your trials are ...
This is the second time I'm getting an out of memory error when I start
a database vacuum or try to vacuum any table. Note this machine has been
used for batch data-load purposes.
=# vacuum analyze code;
ERROR: out of memory
DETAIL: Failed on request of size 1073741820.
I'm running Postgres 8.1.