On 13/12/2011 17:44, Igor Tandetnik wrote:
On 12/13/2011 11:31 AM, Rafael Garcia Leiva wrote:
The problem is that it is very, very slow. It takes nearly 24 hours to
query 1 year of Forex data on my laptop (and I have to work with
10-year periods). I will spend a couple of days learning about SQLite
optimization.

I suspect these two days would be better spent writing a script that just scans through the records in chronological order, with hand-written code to summarize them the way you need. It would be an order of magnitude faster than anything you could come up with in pure SQL.
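To make that concrete, here is a minimal sketch of such a single-pass script in Python. It assumes a hypothetical table ticks(ts, price), where ts is a Unix timestamp in seconds; one chronological scan builds OHLC bars for a given bar size.

    import sqlite3
    from collections import OrderedDict

    # Minimal sketch, assuming a hypothetical table "ticks(ts, price)"
    # where ts is a Unix timestamp in seconds and price is a REAL.
    # One chronological scan builds OHLC bars for a given bar size.

    def aggregate(db_path, bar_seconds=5 * 60):
        conn = sqlite3.connect(db_path)
        bars = OrderedDict()  # bar start time -> [open, high, low, close]
        for ts, price in conn.execute("SELECT ts, price FROM ticks ORDER BY ts"):
            bucket = ts - (ts % bar_seconds)  # start of the bar this tick falls in
            bar = bars.get(bucket)
            if bar is None:
                bars[bucket] = [price, price, price, price]  # first tick opens the bar
            else:
                bar[1] = max(bar[1], price)  # high
                bar[2] = min(bar[2], price)  # low
                bar[3] = price               # close = last tick in the bar
        conn.close()
        return bars

Since the scan touches each record exactly once and keeps only the running bars in memory, its cost is linear in the number of ticks, which is the speedup being suggested here.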

Yes, I agree that a hand-written script will be much faster, but I would like to have the final aggregated data loaded into SQLite, since I have to work with that data (perform all kinds of queries). The problem is that for a complete analysis I need all the aggregated time frames from 2 minutes to 1440 minutes (1 day), since I have to search for the optimum time frame. For 10 years of the EURUSD pair that would mean around 27 million records. Can SQLite manage such a big table?

On the other hand, I don't mind waiting a couple of seconds (the question I'm trying to answer is whether that is possible) for SQLite to aggregate data into the time frame currently being analysed, since the analysis time would be much longer than the query time. But if aggregating data into 5-minute intervals takes 1 day and I cannot reduce that time to the order of seconds, I will try to find another solution, for example having 1440 database files, one for each time frame, but that seems to me a very ugly solution.
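For what it's worth, a 27-million-row table is well within SQLite's limits. A hedged sketch of keeping the per-frame aggregation inside one SQLite file, again assuming the hypothetical ticks(ts, price) schema (the "bars", "frame", and "bucket" names are made up for illustration):

    import sqlite3

    # Sketch of storing all aggregated frames in one SQLite database,
    # under the same hypothetical "ticks(ts, price)" schema.

    def build_bars(conn, minutes):
        conn.execute("""
            CREATE TABLE IF NOT EXISTS bars (
                frame  INTEGER,  -- bar size in minutes
                bucket INTEGER,  -- ts / (frame * 60), i.e. the bar index
                high   REAL, low REAL, n INTEGER,
                PRIMARY KEY (frame, bucket)
            )""")
        # Integer division of the timestamp groups ticks into fixed buckets.
        conn.execute("""
            INSERT OR REPLACE INTO bars
            SELECT ?, ts / ? AS bucket, MAX(price), MIN(price), COUNT(*)
            FROM ticks
            GROUP BY bucket""", (minutes, minutes * 60))
        conn.commit()

    conn = sqlite3.connect("forex.db")   # hypothetical database file
    for m in (2, 5, 15, 60, 1440):       # a few of the 2..1440-minute frames
        build_bars(conn, m)
    conn.close()

Note that plain GROUP BY yields MIN/MAX/COUNT cheaply, but open/close (first and last price per bucket) are awkward in pure SQL, which is exactly where the chronological-scan script suggested above wins.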

Best regards

Rafael
