> I am writing an application that will poll  up to 500 devices every 5
> minutes.  For this application I need to keep at least 6 months of data.  If
> I were to use a single record for every update, I would have ~25 million
> records.  Obviously this is not an ideal solution.
> 
> I am interested in getting ideas on how to tackle this.  I was thinking that
> I would have a table that stored all the values for the current day and a
> table for historical data.  After 24 hours the data would be averaged into a
> daily snapshot and stored in another table.  The data from that day would
> then be cleared.  This would only require ~90k records.
> 
> I would appreciate any ideas.  I have written a ton of web applications
> using mySQL, but have never had to scale above 100k entries :)
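Your nightly roll-up is easy enough to script, by the way.  Run just
after midnight it boils down to roughly this (table and column names
here are invented, so adjust to your own schema):

  -- Average yesterday's raw samples into one row per device ...
  INSERT INTO daily_snapshot (device_id, snap_date, avg_value, samples)
  SELECT device_id, CURDATE() - INTERVAL 1 DAY, AVG(value), COUNT(*)
    FROM current_samples
   WHERE sample_time >= CURDATE() - INTERVAL 1 DAY
     AND sample_time <  CURDATE()
   GROUP BY device_id;

  -- ... then clear the raw data that has just been summarised.
  DELETE FROM current_samples
   WHERE sample_time < CURDATE();

One row per device per day is where your ~90k figure comes from, and
that part of the plan looks fine to me.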

I've got a similar problem (tracking logins with records kept for a long
time), and my solution is a bit messy, but works. For some values of
"works" anyway. :-)

Two databases: one for current data, one for historical data.

One table per day, with the date included in the table name.  Just after
midnight a cron job creates empty tables for the next couple of days and
moves the oldest table(s) into the historical database, merging their
records into a monthly table on the way.  I keep 60 days of data in the
current database and everything older in historical.
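In rough SQL the nightly job amounts to the following (the dates and the
table layout are made up here just to keep the example short):

  -- Create tomorrow's empty table in the current database:
  CREATE TABLE current.logins_20020117 (
      user_id    INT NOT NULL,
      login_time DATETIME NOT NULL,
      INDEX (user_id)
  );

  -- Fold the oldest daily table into that month's historical table
  -- (the monthly table gets created once, at the start of each month):
  INSERT INTO historical.logins_200111
  SELECT * FROM current.logins_20011117;

  -- ... and drop the daily table from the current database.
  DROP TABLE current.logins_20011117;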

Historical queries tend to run a bit slower, but that's life, and people
making such requests are told this up front.

Hope this helps a little.


The Hooker
--
"If you can't copyright mistakes, how are Microsoft going to protect
Windows from pirates?'
