Hi,
I was looking for opinions on the performance of a design involving schemas.  We 
have a 3-tier system with a lot of hand-written SQL in our Java-based server, 
and we want to start limiting the data that different users can access based on 
certain user properties.  Rather than update hundreds of queries throughout our 
server code to apply these properties, we were thinking we would do the 
following:

1. Build a schema for each user.
2. Reset the user's search_path for each database connection so it searches 
their schema first, then the public schema.
3. Inside that user's schema, create about 5 views that "replace" tables of the 
same name in the public schema.  Each view would expose only the subset of rows 
of the corresponding public table that the user's properties allow.
4. Add rules to each of these views so they act as 
insertable/updatable/deletable views.
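
For concreteness, here is a rough sketch of what we have in mind for one user 
and one table.  The role name "acme", the table public.orders(id, region, 
amount), and the region = 'east' property are made-up examples, not our real 
schema:

CREATE SCHEMA acme;

-- Put the user's schema ahead of public for every session of this role.
ALTER ROLE acme SET search_path = acme, public;

-- Shadow the public table with a filtered view of the same name.
CREATE VIEW acme.orders AS
    SELECT id, region, amount
    FROM public.orders
    WHERE region = 'east';

-- Rules to make the view writable (8.2 rules, not triggers).
CREATE RULE orders_ins AS ON INSERT TO acme.orders
    DO INSTEAD INSERT INTO public.orders (id, region, amount)
        VALUES (NEW.id, NEW.region, NEW.amount);
CREATE RULE orders_upd AS ON UPDATE TO acme.orders
    DO INSTEAD UPDATE public.orders
        SET region = NEW.region, amount = NEW.amount
        WHERE id = OLD.id;
CREATE RULE orders_del AS ON DELETE TO acme.orders
    DO INSTEAD DELETE FROM public.orders
        WHERE id = OLD.id;

So an unqualified "SELECT * FROM orders" from that user's connection would hit 
acme.orders rather than public.orders.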

Does anyone have any thoughts on how this might perform over the long haul?  
Any database cleanup or maintenance problems to expect?

We currently handle only about 50 users at a time, but expect that to grow to 
roughly 150-200 users within a year or two.

Running PostgreSQL 8.2.4 on x86_64-unknown-linux-gnu, compiled by GCC gcc (GCC) 
3.4.3

Thanks!
