Tom Lane wrote:
Michal Taborsky <[EMAIL PROTECTED]> writes:

Peter Eisentraut wrote:

Is there any practical limit on the number of parallel connections that a PostgreSQL server can service? We're in the process of setting up a system that will require up to 10000 connections open in parallel. The query load is not the problem, but we're wondering about the number of connections. Does anyone have experience with these kinds of numbers?


No experience, but a little thinking and some elementary-school math tells me that you'd need a huge amount of RAM to support 10000 connections, since Postgres is multi-process. Our typical Postgres process eats 5-40 megs of memory, depending on activity. So even at just 5 megs, 10k connections means about 50 GB of RAM. If those connections are mostly idle, that's a plain waste of resources.
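
To make that back-of-the-envelope math concrete, here is a minimal Python sketch. It assumes one OS process per connection and simply multiplies out the 5-40 MB per-backend range quoted above; the function name is just for illustration.

    # RAM estimate for N PostgreSQL backends, assuming one process
    # per connection and 5-40 MB of private memory per process.
    def ram_needed_gb(connections, mb_per_backend):
        return connections * mb_per_backend / 1024.0

    for mb in (5, 40):
        print("%2d MB/backend -> %5.1f GB for 10000 connections"
              % (mb, ram_needed_gb(10000, mb)))
    # 5 MB/backend  -> ~49 GB  (the "50 GB" figure above)
    # 40 MB/backend -> ~391 GB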


5-40 megs sounds high, unless you run very complex queries.  I wonder
whether you aren't counting Postgres shared memory in that "per process"
figure.  (Most implementations of "top" are not very good about
distinguishing shared and private memory, FWIW.)
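
For what it's worth, on a Linux kernel that exposes /proc/<pid>/smaps you can separate private from shared pages yourself instead of trusting top. A rough sketch in Python (the private_kb helper is ours, the pid is a placeholder, and older kernels don't have this file):

    # Sum the private (non-shared) pages of one process from
    # /proc/<pid>/smaps -- a crude way to see how much memory a
    # Postgres backend really owns, excluding shared buffers.
    def private_kb(pid):
        total = 0
        for line in open("/proc/%d/smaps" % pid):
            if line.startswith(("Private_Clean:", "Private_Dirty:")):
                total += int(line.split()[1])  # value is in kB
        return total

    # e.g. print("%.1f MB private" % (private_kb(12345) / 1024.0))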

But even estimating just a meg or two of private space apiece, the total
is daunting.

Last week I did an Ariadne + PostgreSQL evaluation for the company where I work, and I learned that with 250 MB of RAM you can run up to 80 concurrent queries, and with 500 MB up to 120 concurrent queries; from there on, each additional 250 MB buys you roughly 40 more connections.

If you exceed those limits, the machine starts to thrash...

Peter would then need about 61 GB for 10000 connections, which is quite amazing :-)
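
That 61 GB falls straight out of the rule of thumb above; a quick Python sketch of the extrapolation (these are Gaetano's observed figures, not measured PostgreSQL limits):

    # Extrapolate the rule: 80 connections fit in 250 MB, then
    # ~40 extra connections per additional 250 MB of RAM.
    def ram_gb(connections):
        extra_conns = max(0, connections - 80)
        mb = 250 + (extra_conns / 40.0) * 250
        return mb / 1024.0

    print("%.0f GB for 10000 connections" % ram_gb(10000))  # ~61 GB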


Regards,
Gaetano Mendola
