Wednesday, May 3, 2006, 06:42:34, E Tse wrote:

> I am planning to use sqlite for a project involving huge data. The planned
> sqlite file (which will contain a single table) will grow to more than 4 GB
> in size. Has anyone tried that before, and what problems have you encountered?

Be prepared that your planned design might become subject to change.
I have a 1 GB file here, initially populated mainly by one big table
containing 25 million records (3 floats, 5 integers). For *my* problem,
splitting the table into 200 smaller ones brought a big performance gain.
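Just to illustrate the idea (the table and column names below are made
up, and the right split has to follow your own access patterns), such a
per-bucket layout could look like:

    -- Hypothetical sketch: one small table per bucket instead of one
    -- big one. "samples_042" stands for bucket 42; the real bucketing
    -- criterion has to come from your problem domain.
    CREATE TABLE samples_042 (
        t REAL, x REAL, y REAL,                    -- the 3 floats
        a INTEGER, b INTEGER, c INTEGER,
        d INTEGER, e INTEGER                       -- the 5 integers
    );

    -- A query then only has to scan the one small table it needs:
    SELECT x, y FROM samples_042 WHERE t BETWEEN 0.0 AND 1.0;

The gain comes from each query touching only a fraction of the rows; if
your queries cut across all buckets, such a layout would of course hurt
rather than help.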
Everything matters with tables that big: file system issues (as
mentioned), but also index creation (when, if at all, and for which
columns) and statement optimization (the queries themselves as well as
the handling of the potentially huge result sets a SELECT can return).
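As a minimal sketch of those last two points, continuing with the
made-up samples_042 table from above: create indexes after the bulk
load rather than before it, and walk huge result sets in slices instead
of materializing them all at once.

    -- Hypothetical sketch. Index only the columns you actually filter
    -- on, and do it after the bulk INSERTs so the load itself stays fast.
    CREATE INDEX idx_samples_042_t ON samples_042(t);
    ANALYZE;  -- give the query planner statistics to work with

    -- Fetch a huge result set in slices (keyset paging: remember the
    -- last t you saw and continue from there on the next query):
    SELECT * FROM samples_042 WHERE t > :last_t ORDER BY t LIMIT 1000;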
Without some knowledge of your problem domain, I'm afraid no reasonable
analysis is possible beyond very general remarks. Taking care of those
general points is not difficult, but the solution to your problem
usually lies in finding a structure that fits it.
That said, my splitting was in no way arbitrary either; it followed
from certain conditions of the underlying problem.

Micha  
-- 
