On Jun 19, 2008, at 10:26 AM, Alvaro Herrera wrote:
> David Miller wrote:
>
>> That is fine. Maybe a dynamically configurable parameter that can be
>> set/updated while the database is running.
>
> If it were a parameter, it could not be changed while the database is
> running.
>
>> The issue lies in the fact that we have queries larger than 1K, and we
>> would like to be able to capture the entire query from Postgres Studio
>> without having to process the log files.
>
> Have you considered using CSV logs instead?  Should be easier to
> process.

Would it be hard to have a backend write its complete command out to a file when the command runs for more than X seconds, and then let other backends read it from there? It is extremely annoying not to be able to get the full query text.
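A minimal sketch of that spill-to-file idea, in Python for brevity (the spill directory, file naming, and threshold are all made up for illustration, not anything PostgreSQL actually does):

```python
# Hypothetical sketch: each "backend" spills its full query text to a
# per-PID file once the query has run longer than a threshold, so that
# other sessions can read the untruncated text without log parsing.
import os
import time
import tempfile

SPILL_DIR = tempfile.mkdtemp(prefix="query_spill_")  # stand-in for a data dir
THRESHOLD_SECS = 0.0  # "X seconds"; zero here so the demo spills immediately

def maybe_spill_query(pid, query, started_at):
    """Write the full query text to a file named after the backend PID,
    but only if the query has been running longer than THRESHOLD_SECS."""
    if time.monotonic() - started_at < THRESHOLD_SECS:
        return None
    path = os.path.join(SPILL_DIR, "%d.sql" % pid)
    with open(path, "w") as f:
        f.write(query)
    return path

def read_spilled_query(pid):
    """What another session would do to fetch the untruncated query."""
    with open(os.path.join(SPILL_DIR, "%d.sql" % pid)) as f:
        return f.read()

# A query well past the 1K truncation limit discussed above.
long_query = "SELECT " + ", ".join("col%d" % i for i in range(500)) + " FROM t"
maybe_spill_query(4242, long_query, time.monotonic())
recovered = read_spilled_query(4242)
assert recovered == long_query  # full text, no 1K truncation
```

The point of spilling only long-running queries is that short ones never pay the filesystem write, while the ones you actually want to inspect are, by definition, still running when you go looking.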

Also, I don't necessarily buy that 32k * max_connections is too much shared memory; even with max_connections of 1000 that's only 32M, which is trivial for any box that's actually configured for 1000 connections.
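A quick back-of-the-envelope check of that shared-memory figure (the slot size and connection count are just the numbers from the paragraph above):

```python
# Shared memory needed if every backend slot reserves 32 KB of query text.
slot_bytes = 32 * 1024          # 32k per backend, as proposed
max_connections = 1000
total_bytes = slot_bytes * max_connections
print("%.1f MB" % (total_bytes / 1e6))  # roughly 32 MB, as claimed
```

On a box sized for 1000 concurrent connections, tens of megabytes of shared memory is indeed noise compared to a sensibly sized shared_buffers.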
--
Decibel!, aka Jim C. Nasby, Database Architect  [EMAIL PROTECTED]
Give your computer some brain candy! www.distributed.net Team #1828

