Thank you so much for your suggestion, it probably is the better way to normalize 
the data around the policy record and use multiple tables.


The "millions of tables" is not literally true (though there are around 60 database 
instances), but the hundreds of lines per query is real: those are the actual queries 
provided by the current db team (in fact, all of their queries are more than 200 lines).


I may try both approaches, since I am halfway through my original plan. 


Linking everything to the policy number was my original attempt; the reason I gave 
up and switched to arrays was the historical transaction data, the status updates and 
the multiple policies in one contract. But surely, by summarizing and reformatting 
the current structure, it would significantly reduce the number of tables and make 
things relatively easy.
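
For reference, here is a rough sketch of what I think the policy-keyed layout could 
look like (the table and column names are only placeholders, not our real schema):

-- Hypothetical normalized layout keyed by policy number.
CREATE TABLE contract (
    contract_id bigint PRIMARY KEY
);

-- A contract can hold several policies.
CREATE TABLE policy (
    policy_no   bigint PRIMARY KEY,
    contract_id bigint NOT NULL REFERENCES contract (contract_id)
);

-- Every historical transaction is just another row, not an array element.
CREATE TABLE policy_transaction (
    policy_no bigint NOT NULL REFERENCES policy (policy_no),
    tx_time   timestamptz NOT NULL,
    tx_type   text NOT NULL,
    amount    numeric,
    PRIMARY KEY (policy_no, tx_time)
);

-- Status updates also become rows, so the full history is kept.
CREATE TABLE policy_status_history (
    policy_no  bigint NOT NULL REFERENCES policy (policy_no),
    status     text NOT NULL,
    changed_at timestamptz NOT NULL,
    PRIMARY KEY (policy_no, changed_at)
);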


Thank you again for your advice!!


shore




------------------ Original message ------------------
From: "David G. Johnston"; 
Sendtime: Wednesday, May 23, 2018 10:29 PM
To: "a"<372660...@qq.com>; 
Cc: "Charles Clavadetscher"; "pgsql-general"; 
Subject: Re: RE: RE: How do I select composite array element that 
satisfyspecific conditions.



On Wed, May 23, 2018 at 6:50 AM, a <372660...@qq.com> wrote:



That is just a simplified description; the actual information could be much more, and 
not all of it is, in some way, "aligned". 


Not sure what you are getting at here - "related" is generally the better term 
and usually during modeling one of the tasks is to identify those relationships 
even if they seem to be obscure.  In this case most everything is likely 
related by policy number one way or another.

 
The result would be millions (that is to say, many) of tables lying in the database, 
and each query would be hundreds of lines. It is hard to create a new query that 
targets your information, and it is dangerous to modify any existing query.


I seriously doubt you'd end up with millions of tables...and hundred line 
queries are likely going to happen in spite of your attempts to simplify.  In 
fact I'd say the number of "complex" lines will end up being higher - most of 
the lines in a normal query against a normalized database are verbose but 
simple.
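
For illustration only (these table and column names are made up, not from any actual 
schema), a query against a normalized layout tends to look like this - it grows long, 
but each added line is just another join or output column:

-- Illustrative only: verbose, but every line is simple.
SELECT p.policy_no,
       c.contract_id,
       t.tx_time,
       t.tx_type,
       t.amount
FROM   policy p
JOIN   contract c           ON c.contract_id = p.contract_id
JOIN   policy_transaction t ON t.policy_no   = p.policy_no
WHERE  p.policy_no = 12345
ORDER  BY t.tx_time;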


Now my thinking was to group data into structures so that I can significantly 
decrease the number of tables, and since a column can hold an array, I can actually 
put a year's historical data into one table, which avoids querying multiple 
historical tables and shrinks the size of the database.

An array of composites is a table - you're just making things difficult by not 
actually creating one up front.
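
To make that concrete, here is a minimal sketch with made-up names; the two shapes 
below hold the same information, the array one just has to be unnested before its 
elements can be filtered:

-- Array-of-composites shape: history is packed into one column...
CREATE TYPE tx AS (tx_time timestamptz, amount numeric);

CREATE TABLE policy_arr (
    policy_no bigint PRIMARY KEY,
    txs       tx[]
);

-- ...and must be unnested before you can select elements by condition:
SELECT p.policy_no, t.tx_time, t.amount
FROM   policy_arr p, unnest(p.txs) AS t
WHERE  t.amount > 100;

-- Equivalent child table: the "array" is simply rows,
-- and the same condition is an ordinary WHERE clause.
CREATE TABLE policy_tx (
    policy_no bigint NOT NULL,
    tx_time   timestamptz NOT NULL,
    amount    numeric
);

SELECT policy_no, tx_time, amount
FROM   policy_tx
WHERE  amount > 100;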



However, I am new to this and do not have experience, so if you could provide 
any suggestions, I would be extremely grateful.


The scope of this database seems to be a bit much for one's first attempt at 
doing something like this...


I'd recommend learning and then applying as much technical normalization as you 
can to your model and assume that years of modelling theory is going to be a 
better guide than inexperienced gut instinct.  Starting from a normalized 
position you can selectively de-normalize and add abstraction layers later when 
you come across actual problems that you wish to solve.


David J.
