Hello.

 

From:

http://www.pgpool.net/

pgpool-II also has a limit on the maximum number of connections, but extra 
connections will be queued instead of returning an error immediately.
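
For illustration, a minimal pgpool.conf sketch of that queuing behavior (host, port, and pool sizes are assumptions, not from this thread):

    # minimal pgpool.conf sketch -- values are assumptions
    listen_addresses = '*'
    port = 9999                      # clients connect here, not to 5432
    backend_hostname0 = 'localhost'  # the real PostgreSQL server
    backend_port0 = 5432
    num_init_children = 90           # max concurrent client sessions
    max_pool = 1                     # cached backend connections per child
    # a client beyond num_init_children waits for a free child instead of
    # receiving an error immediately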

 

But your configuration does not look optimal to me. Here are some things you
may try:

1) Get rid of the indexes: keep this table for OLTP, then denormalize the
data, load it into an OLAP table, build the indexes there, and analyze it
(see the SQL sketch after this list).

2) Find the bottleneck using your OS tools (is it I/O or CPU?) and improve
the appropriate subsystem.

3) Use several servers (a multimaster configuration such as
https://wiki.postgresql.org/wiki/Bucardo).
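
As an illustration of point 1, a minimal SQL sketch (table and column names
are assumptions):

    -- index-free staging table that takes the frequent inserts (OLTP side)
    CREATE TABLE device_state_staging (
        device_id integer     NOT NULL,
        state     text        NOT NULL,
        logged_at timestamptz NOT NULL DEFAULT now()
    );

    -- indexed history table for querying (OLAP side)
    CREATE TABLE device_state_history (LIKE device_state_staging);
    CREATE INDEX ON device_state_history (device_id, logged_at);

    -- periodically (e.g. from cron) move the rows atomically, then analyze
    WITH moved AS (
        DELETE FROM device_state_staging RETURNING *
    )
    INSERT INTO device_state_history SELECT * FROM moved;
    ANALYZE device_state_history;

The writable CTE moves the rows in one statement, so nothing is lost between
the copy and the cleanup.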

 

Ilya Kazakevich

 

JetBrains

http://www.jetbrains.com

The Drive to Develop

 

From: pgsql-general-ow...@postgresql.org 
[mailto:pgsql-general-ow...@postgresql.org] On Behalf Of Edmundo Robles
Sent: Monday, August 15, 2016 11:30 PM
To: pgsql-general@postgresql.org
Subject: [GENERAL] Looking for software to 'enqueue' connections

 

Hi!  

I want to find software to 'enqueue' client connections to the database, so 
that if I reach the max limit, the query is held in a queue until a 
connection is released.

 

I have many devices (100+) saving their state to the database every minute, 
but the table is very large (more than 13,000,000 records, with many 
indexes), so inserting one record takes 3 or more minutes.

 

Then there comes a moment when the connection limit is reached :( and I lose 
information.

 

I tried pgbouncer to 'enqueue' the connections but had no success; maybe I am 
missing something...

 

By the way:

I use PostgreSQL 9.4 with max_connections = 100,

and pgbouncer with max_connections set to 100 and reserve_pool_size = 50.
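
For comparison, a minimal pgbouncer.ini sketch in which queuing can occur 
(database name and values are assumptions, not the poster's actual file; the 
pgbouncer client-side limit is called max_client_conn):

    ; minimal pgbouncer.ini sketch -- names and values are assumptions
    [databases]
    devicedb = host=127.0.0.1 port=5432 dbname=devicedb

    [pgbouncer]
    listen_addr = 127.0.0.1
    listen_port = 6432
    pool_mode = transaction    ; server connection released per transaction
    max_client_conn = 1000     ; clients allowed to connect and wait
    default_pool_size = 50     ; connections actually opened to PostgreSQL
    reserve_pool_size = 10     ; emergency extra connections
    ; clients beyond default_pool_size wait in pgbouncer's queue;
    ; only clients beyond max_client_conn are refused

The key is that max_client_conn is larger than the pool size: waiting clients 
are held by pgbouncer instead of hitting the server's max_connections limit.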

 

I hope you can help me...

 

Thanks.

 
