>>>> 2012/07/07 12:10 -0600, Albert Padley >>>>
I have a log table that captures data from various sensors every minute,
which we use to draw graphs on our website.

The table looks like this:

CREATE TABLE `log` (
  `id` int(14) NOT NULL auto_increment,
  `VarName` varchar(255) NOT NULL,
  `TimeString` datetime NOT NULL,
  `VarValue` decimal(25,6) NOT NULL,
  `Validity` int(1) NOT NULL,
  `Time_ms` decimal(25,6) NOT NULL,
  PRIMARY KEY  (`id`),
  KEY `timestamp` (`TimeString`),
  KEY `name` (`VarName`),
  KEY `nametimevalue` (`VarName`,`VarValue`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;


My existing query which works just fine for our purposes:

SELECT CONCAT('[',
              ((UNIX_TIMESTAMP(TimeString) + (".$tz_offset." * 3600))) * 1000,
              ' ,', TRUNCATE(VarValue, 0), ']') AS value
FROM log
WHERE VarName = '04_Set21_SWOS_085_1300CI'
  AND SUBSTR(TimeString, 1, 10) = CURDATE()
ORDER BY TimeString ASC

The only issue is that VarValue tends to vary from minute to minute, which
creates a graph that is not as smooth as we would like. Therefore, instead of
returning VarValue for each minute in the above query, we want to return the
average of the last 5 values for that VarName.

I have been searching around the web all morning and haven't hit on the
proper solution. Help would be much appreciated. 
<<<<<<<<
(It would be neater to write DATE(TimeString) = CURDATE())
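For example, the WHERE clause could then read (a sketch against the table
above; untested):

WHERE VarName = '04_Set21_SWOS_085_1300CI'
  AND DATE(TimeString) = CURDATE()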

I suspect there is a good reason for that, and that the smoothing is better
done afterwards, with "TimeString" and "VarValue" kept separate until then. In
MySQL one could, in a subquery, find the five latest "TimeString"s not greater
than a given "TimeString" and key the average off that (if they are not unique
this is hopeless), but the amount of repetition is enormous. One could do the
smoothing within SQL with a procedure that reads a cursor, or in a script
language on the output. ("tz_offset" would become an argument to the MySQL
procedure.)

