Re: Query planner / Analyse statistics bad estimate rows=1 with maximum statistics 10000 on PostgreSQL 10.2

2019-01-02 Thread Tom Lane
Mark writes: > Am I correct in my understanding that any row that has been modified (i.e. UPDATE) is in state HEAPTUPLE_INSERT_IN_PROGRESS so it will not be included in the sample? An update will mark the existing tuple as delete-in-progress and then insert a new tuple (row version) that's in
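A minimal sketch of that behaviour (the table "demo" is made up here, not from the thread): after an UPDATE, the row's ctid and xmin change, because PostgreSQL has written a new row version rather than modifying the old one in place.

  CREATE TABLE demo (id int PRIMARY KEY, val text);
  INSERT INTO demo VALUES (1, 'before');
  SELECT ctid, xmin, xmax, * FROM demo;        -- note the original ctid and xmin
  UPDATE demo SET val = 'after' WHERE id = 1;
  SELECT ctid, xmin, xmax, * FROM demo;        -- new ctid and new xmin: this is the newly inserted tuple
  DROP TABLE demo;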

Re: Query planner / Analyse statistics bad estimate rows=1 with maximum statistics 10000 on PostgreSQL 10.2

2019-01-02 Thread Mark
Hi Tom, Thanks for your reply. Am I correct in my understanding that any row that has been modified (i.e. UPDATE) is in state HEAPTUPLE_INSERT_IN_PROGRESS so it will not be included in the sample? I'm going to rework the application so there is less time between the DELETE and the COMMIT so I wi
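One possible shape for that rework (purely an assumption sketched here, not something described in the thread): do the slow data preparation outside the transaction, for example into a staging table, so that the DELETE and the COMMIT sit close together and the window a concurrent ANALYZE can hit is short.

  -- Hypothetical staging-table approach; common_student_stage is made up.
  CREATE TEMP TABLE common_student_stage (LIKE common_student INCLUDING ALL);
  -- ... long-running load into common_student_stage, outside the critical transaction ...
  BEGIN;
  DELETE FROM common_student WHERE school_id = 36;
  INSERT INTO common_student SELECT * FROM common_student_stage;
  COMMIT;  -- the delete is now only briefly visible as "in progress" to a concurrent ANALYZE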

Re: Query planner / Analyse statistics bad estimate rows=1 with maximum statistics 10000 on PostgreSQL 10.2

2019-01-02 Thread Tom Lane
Mark writes: > I have a long running job which deletes all of the common_student table and then repopulates it. It takes a long time to load all the other data and commit the transaction. I didn't think the delete inside the transaction would have any effect until it is committed or rolled back

Re: Query planner / Analyse statistics bad estimate rows=1 with maximum statistics 10000 on PostgreSQL 10.2

2019-01-02 Thread Mark
Hi Ron, I tried my test_table example below, swapping 'BEGIN' for: => BEGIN TRANSACTION ISOLATION LEVEL SERIALIZABLE; It made no difference to the behaviour. On Wed, 2 Jan 2019 at 15:45 Ron wrote: > Try using SERIALIZABLE transactions instead of the default READ COMMITTED. > On 1/2/1

Re: Query planner / Analyse statistics bad estimate rows=1 with maximum statistics 10000 on PostgreSQL 10.2

2019-01-02 Thread Ron
Try using SERIALIZABLE transactions instead of the default READ COMMITTED. On 1/2/19 9:28 AM, Mark wrote: Hi Ron, Yes, my process will commit the transaction (or roll it back) eventually. It's the window where one transaction has deleted all the rows (before committing) and an ANALYSE has run

Re: Query planner / Analyse statistics bad estimate rows=1 with maximum statistics 10000 on PostgreSQL 10.2

2019-01-02 Thread Mark
Hi Ron, Yes, my process will commit the transaction (or roll it back) eventually. It's the window where one transaction has deleted all the rows (before committing) and an ANALYSE has run. The deleted rows won't make it into the sample even though the transaction has NOT been committed. During th
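A minimal two-session sketch of the window being described, using the table and school_id from the thread (the exact statements are assumptions, not taken from Mark's actual job):

  -- Session A: the long-running job, transaction left open while the rest of the data loads
  BEGIN;
  DELETE FROM common_student WHERE school_id = 36;
  -- ... still loading, no COMMIT yet ...

  -- Session B: autovacuum or a manual ANALYZE runs during that window
  ANALYZE common_student;
  SELECT n_distinct, most_common_vals
    FROM pg_stats
   WHERE tablename = 'common_student' AND attname = 'school_id';
  EXPLAIN SELECT * FROM common_student WHERE school_id = 36;
  -- the thread reports the estimate coming out as rows=1 during this window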

Re: Query planner / Analyse statistics bad estimate rows=1 with maximum statistics 10000 on PostgreSQL 10.2

2019-01-02 Thread Ron
But Jeff said "left open indefinitely without either committing or rolling back".  Your process is committing the transaction. On 1/2/19 6:15 AM, Mark wrote: Hi Jeff, Thanks for your help. That is exactly what is happening. I have a long running job which deletes all of the common_student tab

Re: Query planner / Analyse statistics bad estimate rows=1 with maximum statistics 10000 on PostgreSQL 10.2

2019-01-02 Thread Mark
Hi Maxim, Thanks for your help. Jeff has pointed me in the right direction here: it seems that the rows=1 is due to a long running transaction which deletes all of common_student for school_id 36 and then repopulates it. I was unaware that the delete inside the transaction would affect the VACUU

Re: Query planner / Analyse statistics bad estimate rows=1 with maximum statistics 10000 on PostgreSQL 10.2

2019-01-02 Thread Mark
Hi Jeff, Thanks for your help. That is exactly what is happening. I have a long running job which deletes all of the common_student table and then repopulates it. It takes a long time to load all the other data and commit the transaction. I didn't think the delete inside the transaction would have any effect until it is committed or rolled back

Re: Query planner / Analyse statistics bad estimate rows=1 with maximum statistics 10000 on PostgreSQL 10.2

2018-12-23 Thread Maxim Boguk
Hi Mark, It looks very weird. Can you try something like this (check that you have enough disk space for a second copy of common_student first): create table test_table AS SELECT * from common_student; Vacuum analyze test_table; explain analyze select * from test_table where school_id = 36; drop
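The suggested test, laid out as runnable statements (the final statement is cut off in the archive; completing it as a drop of test_table is an assumption):

  CREATE TABLE test_table AS SELECT * FROM common_student;
  VACUUM ANALYZE test_table;
  EXPLAIN ANALYZE SELECT * FROM test_table WHERE school_id = 36;
  DROP TABLE test_table;  -- assumed completion of the truncated "drop"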

Re: Query planner / Analyse statistics bad estimate rows=1 with maximum statistics 10000 on PostgreSQL 10.2

2018-12-22 Thread Jeff Janes
> - Does the analyse output below mean that it only scanned 51538 of 65463 rows in the table? Is school_id 36 just being missed in the sample? (This happens when the analyse is repeated.) Is there a transaction which had deleted all of school_id=36, and then was just left open indefinitely without either committing or rolling back?
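One way to check for such a transaction (not part of Jeff's message, just a standard diagnostic) is to look in pg_stat_activity for sessions that have held a transaction open for a long time:

  SELECT pid, state, now() - xact_start AS xact_open_for, query
    FROM pg_stat_activity
   WHERE xact_start IS NOT NULL
   ORDER BY xact_start;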

Re: Query planner / Analyse statistics bad estimate rows=1 with maximum statistics 10000 on PostgreSQL 10.2

2018-12-22 Thread Mark
> ALTER TABLE common_student ALTER COLUMN school_id SET STATISTICS 10000; > ANALYZE common_student; > Regards, Virendra > *From:* Mark [mailto:mwchamb...@gmail.com] *Sent:* Friday, December 21, 2018 11:39 AM *To:* pg

RE: Query planner / Analyse statistics bad estimate rows=1 with maximum statistics 10000 on PostgreSQL 10.2

2018-12-21 Thread Kumar, Virendra
Sent: Friday, December 21, 2018 11:39 AM To: pgsql-general@lists.postgresql.org Subject: Query planner / Analyse statistics bad estimate rows=1 with maximum statistics 10000 on PostgreSQL 10.2 Hi All, I'm seeing some very slow queries and it looks like the query planner is deciding to do a 'N

Query planner / Analyse statistics bad estimate rows=1 with maximum statistics 10000 on PostgreSQL 10.2

2018-12-21 Thread Mark
Hi All, I'm seeing some very slow queries and it looks like the query planner is deciding to do a 'Nested Loop Left Join' which is slow. When I SET enable_nestloop=OFF for testing it does a 'Hash Left Join' which is much faster. I think I can see the cause of the problem in the examples below.
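The session-level experiment being described might look like the following; the actual slow query is not shown in this snippet, so the joined table (exam_result) and its columns are invented purely for illustration:

  -- Hypothetical query shape; exam_result and the join columns are made up.
  EXPLAIN ANALYZE
  SELECT *
    FROM common_student s
    LEFT JOIN exam_result r ON r.student_id = s.id
   WHERE s.school_id = 36;     -- planner picks a Nested Loop Left Join

  SET enable_nestloop = off;   -- session-local planner toggle, for testing only
  EXPLAIN ANALYZE
  SELECT *
    FROM common_student s
    LEFT JOIN exam_result r ON r.student_id = s.id
   WHERE s.school_id = 36;     -- now a Hash Left Join, reported to be much faster

  RESET enable_nestloop;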