Fabio La Farcioli wrote:
I am developing a web app for thousands of users (1,000 to 2,000).

Each user has 2 work tables... so I end up with 2,000 (users) x 2 tables = 4,000 tables!

Does Postgres support such a large number of tables?
Will I have performance problems?

We have run databases with over 100,000 tables with no problems.

However, we found that it's not a good idea to have a table-per-user design.
As you get more users, the database becomes hard to maintain, and most of the
time only a few users are active anyway.

So, we create a single large "archive" table, identical to the per-user table except that 
it also has a user-id column.  When a user hasn't logged in for a few hours, a cron process copies 
their tables into the large archive table, and returns their personal tables to a "pool" 
of available tables.
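The archive step might look roughly like this sketch; the table and column names (work_table_17, archive_work, table_pool) are hypothetical stand-ins, not taken from the actual system:

```sql
-- Archive step, run by the cron job for an idle user (id 1234).
BEGIN;

-- Copy the user's rows into the shared archive, adding the
-- user-id column that the per-user table omits.
INSERT INTO archive_work (user_id, col_a, col_b)
SELECT 1234, col_a, col_b
FROM work_table_17;

-- Empty the per-user table and mark it as unassigned in the pool.
TRUNCATE work_table_17;
UPDATE table_pool
   SET user_id = NULL
 WHERE table_name = 'work_table_17';

COMMIT;
```

TRUNCATE is cheap compared with DELETE here, since the whole table is being returned to the pool empty.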

When the user logs back in, a hidden part of the login process gets a table 
from the pool of available tables, assigns it to this user, and copies the 
user's  data from the archive into this personal table.  They are now ready to 
work. This whole process takes just a fraction of a second for most users.
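The login-time checkout could be sketched like this, again with hypothetical names; the FOR UPDATE lock prevents two concurrent logins from claiming the same pool table:

```sql
-- Login step: claim a free table from the pool for user 1234
-- and restore their archived rows into it.
BEGIN;

-- Atomically claim one unassigned table.
UPDATE table_pool
   SET user_id = 1234
 WHERE table_name = (SELECT table_name
                       FROM table_pool
                      WHERE user_id IS NULL
                      LIMIT 1
                        FOR UPDATE)
RETURNING table_name;   -- e.g. 'work_table_17'

-- Copy the user's rows back, then clear them from the archive.
INSERT INTO work_table_17 (col_a, col_b)
SELECT col_a, col_b FROM archive_work WHERE user_id = 1234;

DELETE FROM archive_work WHERE user_id = 1234;

COMMIT;
```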

We keep a pool of about 200 tables, which automatically will expand (create 
more tables) if needed, but we've never had more than 200 users active at one 
time.
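Expanding the pool is just a matter of cloning the template table's structure and registering the new table; a minimal sketch, assuming a template table named work_table_template:

```sql
-- Create another empty table with the same structure as the
-- per-user template, and register it as unassigned in the pool.
CREATE TABLE work_table_201 (LIKE work_table_template INCLUDING ALL);
INSERT INTO table_pool (table_name, user_id) VALUES ('work_table_201', NULL);
```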

Craig

--
Sent via pgsql-performance mailing list (pgsql-performance@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-performance
