> When you dig into it, most sites have a lot of data that can be out of sync
> for some period.

Agreed. We run an accounting application which just happens to be
delivered via the web.  This definitely colors (distorts?) my view.

> heavy SQL.  Some people would say to denormalize the database at that point,
> but that's really just another form of caching.

Absolutely.  Denormalization is the root of all evil. ;-)
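
To make "denormalization is just another form of caching" concrete,
it's the classic precomputed-total trick.  A sketch (the orders /
order_items schema and column names are made up for illustration):

    # Normalized: recompute the answer from the line items every time.
    my ($total) = $dbh->selectrow_array(
        'SELECT SUM(price_cents * qty) FROM order_items WHERE order_id = ?',
        undef, $order_id,
    );

    # Denormalized: keep a total_cents column on orders and refresh it
    # whenever the line items change.  It's a cache -- it can go stale
    # if any code path forgets to update it.
    $dbh->do(
        'UPDATE orders
            SET total_cents = (SELECT SUM(price_cents * qty)
                                 FROM order_items WHERE order_id = ?)
          WHERE order_id = ?',
        undef, $order_id, $order_id,
    );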

> No need to do that yourself.  Just use DBIx::Profile to find the hairy
> queries.

History.  Also, another good trick is to make sure your select
statements are as similar as possible.  It is often better to bundle a
couple of similar queries into a single one, since the database's query
compiler caches compiled statements and can only reuse a plan when it
sees the same SQL text again.
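
Roughly, in DBI terms (the DBIx::Profile calls are from memory, so
check its docs; the account table and column names are invented):

    use DBIx::Profile;   # drop-in DBI subclass that times each query

    # Connect through DBIx::Profile instead of DBI to find the hairy
    # queries; dump the per-query timings at the end of the request.
    my $dbh = DBIx::Profile->connect($dsn, $user, $password);

    # Placeholders keep the SQL text identical from call to call, so
    # the query compiler can reuse its cached plan, and prepare_cached
    # reuses the statement handle on the Perl side as well.
    my $sth = $dbh->prepare_cached(
        'SELECT id, name, balance FROM account WHERE customer_id = ?');
    $sth->execute($customer_id);

    # And rather than two nearly identical selects, bundle them:
    my $rows = $dbh->selectall_arrayref(
        'SELECT id, name, balance FROM account WHERE customer_id IN (?, ?)',
        undef, $cust_a, $cust_b);

    $dbh->printProfile;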

> Ironically, I am quoted in Philip Greenspun's book on web publishing saying
> just what you are saying: that databases should be fast enough without
> middle-tier caching.  Sadly, sometimes they just aren't.

Nearly every system design decision has an equally valid converse.
The art is knowing when to buy and when to sell.  And Greenspun's book
is a great resource btw.

Rob
