Hi,

We are currently running a PostgreSQL server (upgraded to 8.1) that has one large database with approx. 15,000 tables. Unfortunately, performance suffers from this because the system catalogs (especially pg_attribute, which holds the column information) grow very large.
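For reference, a query along these lines (just a rough sketch; pg_relation_size and pg_size_pretty are built in as of 8.1) shows how big those catalog relations have become:

    -- Rough check of catalog size; the relation names listed here
    -- are only examples and may need adjusting for your setup.
    SELECT relname,
           relpages,
           reltuples,
           pg_size_pretty(pg_relation_size(oid)) AS on_disk_size
    FROM pg_class
    WHERE relname IN ('pg_class', 'pg_attribute', 'pg_attribute_relid_attnam_index')
    ORDER BY relpages DESC;

pg_attribute grows roughly in proportion to the total number of columns across all tables, which is why it is the catalog that hurts most with this many tables.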

(We NEED that many tables; please don't recommend reducing them.)

Logically these tables could be grouped into 500 databases. My question is:

Would performance be better if I had 500 databases (on one PostgreSQL server instance), each containing 30 tables, or is it better to have one large database with 15,000 tables? Back in the days of PostgreSQL 6.5 we tried that, but performance with many databases was horrible...

BTW: I searched the mailing list but found nothing on the subject, and there also isn't any information in the documentation about the effect of the number of databases, tables, or attributes on performance.

Now, what do you say? Thanks in advance for any comments!

Mike
