On Tue, Dec 29, 2009 at 4:09 PM, Alvaro Herrera wrote:
> It's expecting 85k distinct groups. If that's not accurate, then
> HashAggregate would use more memory than expected.

Great diagnosis. There are actually about 76 million distinct groups.

> See if you can make it work by setting enable_hashagg = off.
Alvaro Herrera writes:
> It's expecting 85k distinct groups. If that's not accurate, then
> HashAggregate would use more memory than expected. See if you can make
> it work by setting enable_hashagg = off.
> If that works, good -- the real solution is different. Maybe you need
> to ANALYZE more [...]
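A minimal sketch of the two steps Alvaro describes -- the session-level workaround and the statistics fix -- assuming a hypothetical table t grouped by a column k (names not from the thread):

```sql
-- Workaround: steer the planner away from HashAggregate so the GROUP BY
-- falls back to a sort-based GroupAggregate, which spills to disk rather
-- than growing an in-memory hash table of all distinct groups.
SET enable_hashagg = off;
EXPLAIN SELECT k, count(*) FROM t GROUP BY k;

-- Real fix: collect more statistics for the grouping column so the
-- planner's n_distinct estimate improves, then re-run ANALYZE.
ALTER TABLE t ALTER COLUMN k SET STATISTICS 1000;
ANALYZE t;
```

Note that SET only affects the current session; enable_hashagg is a planner hint of last resort, while a better n_distinct estimate fixes the plan choice itself.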
On Tue, Dec 29, 2009 at 3:41 PM, Anthony wrote:
> I'm running a group by query on a table with over a billion rows and my
> memory usage is seemingly growing without bounds. Eventually the mem usage
> exceeds my physical memory and everything starts swapping.
>
I guess I didn't ask my question.
Hi all,
I'm running a group by query on a table with over a billion rows and my
memory usage is seemingly growing without bounds. Eventually the mem usage
exceeds my physical memory and everything starts swapping. Here is what I
gather to be the relevant info:
My machine has 768 megs of ram.
[...]
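For context on the misestimate diagnosed later in the thread: the planner's distinct-group guess can be inspected before running the query. A sketch, again assuming a hypothetical table t and grouping column k:

```sql
-- The "rows" figure on the HashAggregate node in the plan is the
-- planner's estimate of the number of distinct groups.
EXPLAIN SELECT k, count(*) FROM t GROUP BY k;

-- That estimate comes from n_distinct in pg_stats: a positive value
-- is an absolute count of distinct values, a negative value is a
-- fraction of the table's row count.
SELECT attname, n_distinct
FROM pg_stats
WHERE tablename = 't' AND attname = 'k';
```

If n_distinct is wildly below reality (here: 85k estimated vs. ~76 million actual), HashAggregate will allocate far more memory than the planner budgeted for.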