On 1/16/15 11:00 AM, Pavel Stehule wrote:
> Hi all,
>
> Some time ago, I proposed measuring lock time per query. The main issue
> was how to present that information. Today's proposal is a little bit
> simpler, but still useful: we can show the total lock time per database
> in the pg_stat_database statistics. A high number can be a signal of
> lock contention issues.

Would this not use the existing stats mechanisms? If so, couldn't we do this per table? 
(I realize that won't handle all cases; we'd still need a "lock_time_other" 
somewhere).

Also, what do you mean by 'lock'? Heavyweight locks? We already have some
visibility there. What I wish we had is some way to know whether we're
spending a lot of time on a particular non-heavyweight lock. Actually
measuring the time probably wouldn't make sense, but we might be able to
count how often we fail the initial acquisition, or something along those
lines.
--
Jim Nasby, Data Architect, Blue Treble Consulting
Data in Trouble? Get it in Treble! http://BlueTreble.com


--
Sent via pgsql-hackers mailing list (pgsql-hackers@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-hackers