Can anyone point me towards good articles or books that would help a PostgreSQL 
novice (i.e. me) learn the optimal approaches to setting up a DB for analytics?

In this particular case, I need to efficiently analyze approximately 300
million system log events (i.e. time-series data). Because it's log data,
rows are only ever appended to the table, never updated. Only 90 days'
worth of data will be retained, so old records need to be deleted
periodically. Query performance will only matter for small subsets of the
data (e.g. when analyzing a day's or week's worth of events); the rest of
the reports will be run in batch mode. There will likely be only one user
at a time doing ad-hoc queries.
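
For what it's worth, here is a minimal sketch of the kind of layout I have
in mind (assuming a PostgreSQL version with declarative range partitioning,
i.e. 10 or later; the table, column, and partition names are just
placeholders). Partitioning by event time should keep day/week queries on a
few small partitions and turn the 90-day purge into a cheap DROP TABLE
rather than a bulk DELETE:

    -- Parent table, partitioned by event time.
    CREATE TABLE log_events (
        event_time  timestamptz NOT NULL,
        host        text        NOT NULL,
        message     text
    ) PARTITION BY RANGE (event_time);

    -- One partition per day; these would be created ahead of time
    -- (e.g. from a cron job). Dates here are purely illustrative.
    CREATE TABLE log_events_2024_01_01
        PARTITION OF log_events
        FOR VALUES FROM ('2024-01-01') TO ('2024-01-02');

    -- Index the partition key so small time-range scans stay fast.
    CREATE INDEX ON log_events_2024_01_01 (event_time);

    -- Retention: once a partition falls outside the 90-day window,
    -- drop it instead of deleting rows.
    DROP TABLE log_events_2024_01_01;
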

This is a follow-up to the earlier suggestions that PostgreSQL would handle
the volumes of data I plan to work with, so I figured I'd give it a shot.

Rob
