Olexandr, I fear that model may not scale well.  Having data spread across
so many different tables certainly reduces the effectiveness of memory
caching.

I recommend you alter the design to have 30 tables total, each with a
column for user id, rather than 30 tables per user.  This may improve
performance by allowing memory caching to be more effective.  If you index
the user id field, performance should remain good even though the tables
will be larger.
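
For example, a minimal sketch of what one shared table might look like
(the table and column names here are hypothetical, not taken from the
actual script):

  -- One shared table per logical entity, instead of one copy per user.
  CREATE TABLE forum_posts (
      post_id  INT UNSIGNED NOT NULL AUTO_INCREMENT,
      user_id  INT UNSIGNED NOT NULL,  -- which user's forum the row belongs to
      topic_id INT UNSIGNED NOT NULL,
      body     TEXT,
      PRIMARY KEY (post_id),
      KEY idx_user_id (user_id)        -- keeps per-user lookups fast as the table grows
  ) ENGINE=InnoDB;

  -- Queries then filter on user_id rather than selecting a per-user table:
  SELECT * FROM forum_posts WHERE user_id = 42 AND topic_id = 7;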

Dan


On 2/9/07, Olexandr Melnyk <[EMAIL PROTECTED]> wrote:

Hello,

I am working on a hosting solution for a popular open source script, where
users can sign up and get their own forum. We use InnoDB, and each user has
their own set of tables (around 30); there are currently around 500 users
registered. Despite the large number of tables, the database is small,
around 50 MB.

The server is a dual Xeon with 4 GB of RAM, and MySQL uses all of the
memory available on the server, so there is nothing left for Apache.

As the user base grows, what steps would you suggest to keep database
performance good and to make RAM usage more efficient? The former is more
important.

------------------------
Olexandr Melnyk,
http://omelnyk.net/
