I think Python has some libraries ready for this, but they are more 
targeted at "huge data analysis". Google around for that, and if you end up 
writing your own code, share it back; if it's flexible enough, a module in 
contrib for everyone could be derived from your code.
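
For example, with pandas (one of those data-analysis libraries), bucketing 
by an arbitrary time frame is a resample away once the timestamps sit in a 
datetime index. A minimal sketch, assuming the rows were already fetched 
through the DAL (the record/created_on names are taken from the thread 
below):

import pandas as pd

# pull the timestamps out of the DAL and index a dummy series by them
rows = db(db.record.id > 0).select(db.record.created_on)
ts = pd.Series(1, index=pd.to_datetime([r.created_on for r in rows]))

# count records per 15-minute bucket; '10min', '400s', etc. work the same
counts = ts.resample('15min').count()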

On Friday, December 14, 2012 2:42:39 PM UTC+1, Paolo wrote:
>
> Hi Niphlod, thanks for clarifying this for me. I will go for the 
> aggregation by hand, which is more portable than a database view. However, 
> we could provide something like that in the web2py library or somewhere, 
> to speed up the work for future development.
>
>
>
> 2012/12/14 Niphlod <nip...@gmail.com>
>
>> there are no "native" functions on databases to group by, e.g., a quarter 
>> hour. 
>> For those kinds of things, either you aggregate by hand in Python or code 
>> a view directly in the database to exploit its functions (depending on the 
>> engine, this could be anywhere from impossible to semi-complicated :-P) and 
>> access it with migrate=False within web2py.
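>>
>> For instance, on PostgreSQL such a view could bucket by quarter hour via 
>> epoch arithmetic. A sketch only; the record/created_on names come from 
>> this thread, the view name is made up, and the CREATE VIEW would be run 
>> once as setup:
>>
>> db.executesql("""
>>     CREATE VIEW record_by_quarter AS
>>     SELECT to_timestamp(floor(extract(epoch FROM created_on) / 900) * 900)
>>                AS bucket,
>>            count(*) AS n
>>       FROM record
>>      GROUP BY 1;
>> """)
>> db.define_table('record_by_quarter',
>>     Field('bucket', 'datetime'),
>>     Field('n', 'integer'),
>>     primarykey=['bucket'],  # a view has no id column
>>     migrate=False)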
>>
>>
>> On Friday, December 14, 2012 12:53:43 PM UTC+1, Paolo wrote:
>>>
>>> Dear all, 
>>> I am looking for some advice on how to perform time aggregations. So far 
>>> I have been able to aggregate only by hours or minutes, but nothing in 
>>> between, for example every 10 or 15 minutes. For instance, this query 
>>> counts the number of records aggregated by hour:
>>> c = db.record.id.count()
>>> s = (db.record.created_on.year() | db.record.created_on.month() |
>>>      db.record.created_on.day() | db.record.created_on.hour())
>>> rows = db(db.record.id > 0).select(db.record.created_on, c, groupby=s)
>>> while this one aggregates down to the minute:
>>> c = db.record.id.count()
>>> s = (db.record.created_on.year() | db.record.created_on.month() |
>>>      db.record.created_on.day() | db.record.created_on.hour() |
>>>      db.record.created_on.minutes())
>>> rows = db(db.record.id > 0).select(db.record.created_on, c, groupby=s)
>>> How can I make the above queries more flexible, i.e., aggregate over 
>>> arbitrary time frames such as 10 minutes, 400 seconds, and so on?
>>> So far I have solved this by running a plain query without any 
>>> aggregation and then counting and aggregating by hand. I hope there is a 
>>> better, more direct approach.
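>>>
>>> (For reference, the by-hand version boils down to integer-dividing each 
>>> timestamp's epoch seconds by the bucket size. A minimal sketch, with the 
>>> 600-second bucket being an assumption:)
>>>
>>> from collections import Counter
>>> import calendar
>>>
>>> BUCKET = 600  # seconds: 600 = 10 minutes, 400 = 400 seconds, ...
>>> rows = db(db.record.id > 0).select(db.record.created_on)
>>> # timegm() treats the naive datetimes as UTC; // floors into buckets
>>> counts = Counter(
>>>     calendar.timegm(r.created_on.timetuple()) // BUCKET for r in rows)
>>> # counts maps bucket index (epoch // BUCKET) to the number of records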
>>>
>>> Regards
>>> Paolo
>>>
>>>  
>>>
>
>
>
> -- 
>  Paolo
>  
