I am storing electricity consumption data in SQLite. A simple table storing kWh consumption looks like the following example (each row holds the accumulated total reading, exactly as shown on the electricity meter):
|ID|timestamp          |kWh  |
|1 |2019-07-31 14:24:25|270.8|
|2 |2019-07-31 14:31:02|272.1|
|3 |2019-08-01 06:56:45|382.5|
|4 |2019-08-01 16:29:01|382.5|
|5 |2019-08-01 20:32:53|582.5|
|6 |2019-08-02 16:18:14|612.1|
|7 |2019-08-08 07:13:04|802.7|
|..|...................|.....|

- The data interval is not predictable (it is random).
- A day may have no records at all (for example, after a data-transmission failure).
- There can be many consecutive records with an identical reading (no energy consumption) spanning one or more days.

My question is how to write a SQL SELECT that returns the energy consumption for a required interval, summarized by day, week, or month. The real challenge is producing a per-day average for the days when no records were taken (in the example table, the days between ID 6 and ID 7), with each day as its own row. It looks like a simple question, but I have been pulling my hair out for two days trying to find a solution.

_______________________________________________
sqlite-users mailing list
sqlite-users@mailinglists.sqlite.org
http://mailinglists.sqlite.org/cgi-bin/mailman/listinfo/sqlite-users
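One possible approach (a sketch, not a definitive answer, assuming a table `readings(ID, timestamp TEXT, kWh REAL)` matching the example above): linearly interpolate the meter reading at each midnight between the two surrounding readings, then take per-day differences. Days inside a gap then automatically receive equal shares, i.e. the average consumption over the gap. Sketched here in Python with the stdlib `sqlite3` module so the SQL is runnable end-to-end:

```python
import sqlite3

# Sample data from the post, in a hypothetical "readings" table.
rows = [
    (1, "2019-07-31 14:24:25", 270.8),
    (2, "2019-07-31 14:31:02", 272.1),
    (3, "2019-08-01 06:56:45", 382.5),
    (4, "2019-08-01 16:29:01", 382.5),
    (5, "2019-08-01 20:32:53", 582.5),
    (6, "2019-08-02 16:18:14", 612.1),
    (7, "2019-08-08 07:13:04", 802.7),
]
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE readings (ID INTEGER PRIMARY KEY, timestamp TEXT, kWh REAL)")
con.executemany("INSERT INTO readings VALUES (?, ?, ?)", rows)

DAILY_SQL = """
WITH RECURSIVE
-- one row per calendar day, from the first reading to one day past the last
days(d) AS (
  SELECT date((SELECT MIN(timestamp) FROM readings))
  UNION ALL
  SELECT date(d, '+1 day') FROM days
  WHERE d <= date((SELECT MAX(timestamp) FROM readings))
),
-- for each midnight: last reading at/before it and first reading after it
pts(d, k0, t0, k1, t1) AS (
  SELECT days.d,
    (SELECT kWh                  FROM readings WHERE timestamp <= days.d || ' 00:00:00'
       ORDER BY timestamp DESC LIMIT 1),
    (SELECT julianday(timestamp) FROM readings WHERE timestamp <= days.d || ' 00:00:00'
       ORDER BY timestamp DESC LIMIT 1),
    (SELECT kWh                  FROM readings WHERE timestamp > days.d || ' 00:00:00'
       ORDER BY timestamp LIMIT 1),
    (SELECT julianday(timestamp) FROM readings WHERE timestamp > days.d || ' 00:00:00'
       ORDER BY timestamp LIMIT 1)
  FROM days
),
-- estimated meter reading at each midnight: linear interpolation;
-- before the first / after the last reading, clamp to the nearest reading
est(d, kwh) AS (
  SELECT d,
    CASE
      WHEN k0 IS NULL THEN k1
      WHEN k1 IS NULL THEN k0
      ELSE k0 + (k1 - k0) * (julianday(d) - t0) / (t1 - t0)
    END
  FROM pts
)
-- consumption of day d = reading at next midnight minus reading at this midnight
SELECT a.d AS day, b.kwh - a.kwh AS kwh_used
FROM est AS a JOIN est AS b ON b.d = date(a.d, '+1 day')
ORDER BY a.d
"""

daily = con.execute(DAILY_SQL).fetchall()
for day, kwh in daily:
    print(day, round(kwh, 3))
```

With this, the empty days 2019-08-03 through 2019-08-07 each get an equal share of the 190.6 kWh consumed between ID 6 and ID 7, and the per-day totals still sum to the overall meter difference. Weekly or monthly summaries can reuse the same CTEs by wrapping the final SELECT, e.g. `SELECT strftime('%Y-%m', day), SUM(kwh_used) ... GROUP BY 1`.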