On 12/30/15 1:31 PM, Joe Conway wrote:
> On 12/30/2015 11:09 AM, Cory Tucker wrote:
> > [...]
> With this scenario you can expect an autoanalyze every 5 million rows
> and autovacuum every 10 million. In my experience (and based on your
> description, yours as well) this is not often enough. Not only that, when [...]
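The 5M/10M figures quoted above follow from the standard trigger formula: autoanalyze fires once the number of rows modified since the last analyze exceeds autovacuum_analyze_threshold + autovacuum_analyze_scale_factor * reltuples, and autovacuum likewise with its own threshold and scale factor; scale factors of 0.05 and 0.1 on a table of roughly 100M rows yield about 5 million and 10 million rows respectively. A hedged sketch of how to check the accumulated churn yourself ('my_table' is a placeholder name, not from the thread):

```sql
-- Check accumulated churn and the last automatic maintenance runs.
-- 'my_table' is a placeholder; n_mod_since_analyze requires PostgreSQL 9.4+.
SELECT relname,
       n_dead_tup,            -- dead rows awaiting autovacuum
       n_mod_since_analyze,   -- rows changed since the last analyze
       last_autovacuum,
       last_autoanalyze
FROM pg_stat_user_tables
WHERE relname = 'my_table';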
Cory Tucker writes:
> This table is almost always queried using a combination of (account_id,
> record_id) and is generally pretty fast. However, under certain loads, the
> query becomes slower and slower as time goes on. The workload that causes
> this to happen is when [...]
On Wed, Dec 30, 2015 at 11:20 AM Tom Lane wrote:
> Cory Tucker writes:
> > This table is almost always queried using a combination of (account_id,
> > record_id) and is generally pretty fast. However, under certain loads,
> > the query becomes [...]
We have a performance problem accessing one of our tables, I think because
the statistics are out of date. The table is fairly large, on the order of
100M rows or so.
The general structure of the table is as follows:
 Column | Type | Modifiers
--------+------+-----------
[...]
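Since the poster attributes the slowdown to out-of-date statistics, one way to test that theory is to compare the planner's row estimates against the actual row counts with EXPLAIN ANALYZE; a large divergence is the classic symptom of stale statistics. A minimal sketch, where the table name and literal values are stand-ins and only the (account_id, record_id) column pair comes from the thread:

```sql
-- If estimated rows diverge wildly from actual rows, statistics are stale.
-- 'records' and the literal values are placeholders.
EXPLAIN (ANALYZE, BUFFERS)
SELECT *
FROM records
WHERE account_id = 42
  AND record_id = 12345;
```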
On 12/30/2015 11:09 AM, Cory Tucker wrote:
> We have a performance problem accessing one of our tables, I think
> because the statistics are out of date. The table is fairly large, on
> the order of 100M rows or so.
> The fix I have employed to restore the speed of the query after I notice
> it [...]
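The quoted message is cut off before it names the poster's actual fix, but the usual remedies when statistics fall behind on a table of this size are a manual ANALYZE and per-table autovacuum storage parameters so the automatic runs fire more often than the global defaults allow; 'records' and the specific values below are illustrative assumptions, not taken from the thread:

```sql
-- Refresh planner statistics immediately ('records' is a placeholder name).
ANALYZE records;

-- Make autoanalyze/autovacuum trigger far more often than the global
-- defaults would on a ~100M-row table (values are illustrative).
ALTER TABLE records SET (
    autovacuum_analyze_scale_factor = 0.005,
    autovacuum_vacuum_scale_factor  = 0.01
);
```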