I did a Computer Science MSc 30 years ago specialising in databases (the
relational model was only in prototypes at the time). Normalisation was of
course well known, but the received wisdom was that normalising is the easy
part; the skill comes in 'collapsing'. More recently the term 'denormalise'
has been used instead. This is where you repeat foreign data in a table to
avoid the overhead of joins at runtime.
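
To make the terminology concrete, here is a minimal sketch (the
customer/policy schema is purely hypothetical, just to illustrate the
trade-off):

    -- Normalised: a customer's name is stored once, so listing
    -- policies with the holder's name requires a join.
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    );
    CREATE TABLE policies (
        policy_id   INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
        premium     REAL NOT NULL
    );
    SELECT p.policy_id, c.name, p.premium
      FROM policies p
      JOIN customers c ON c.customer_id = p.customer_id;

    -- Denormalised: the name is copied into every policy row, so the
    -- join disappears but the duplicated data must be kept in sync.
    CREATE TABLE policies_denorm (
        policy_id     INTEGER PRIMARY KEY,
        customer_id   INTEGER NOT NULL,
        customer_name TEXT NOT NULL,  -- repeated from customers
        premium       REAL NOT NULL
    );
    SELECT policy_id, customer_name, premium FROM policies_denorm;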

Over the intervening years I can't ever remember denormalising data (even
when dealing with, e.g., 13 million insurance customers in a table). Is it OK
nowadays to say you should always aim to be fully normalised - modern RDBMSs
are usually powerful enough to cope with almost anything?