Monday, June 19, 2006, 07:37:22, Manzoor Ilahi Tamimy wrote:

> The database size is more than 500 MB.
> It contains one table and about 10 million records.

I had problems with even more records (roughly 25 million, > 1 GB of
data) and in the end I stopped trying to do it in pure sqlite, also
because datasets with even more data (10 GB) are foreseeable.
Anyway, the problem has led to another solution. In _my case_ the bulky
data are relatively simple, and accessing them does not require anything
SQL has to offer. So hdf5 (http://hdf.ncsa.uiuc.edu/HDF5/) for the mass
data plus sqlite for the more sophisticated (but much smaller) tables
play pretty well together. E.g., the hdf5 library was able to write a
complete 1.2 GB file in 25 s - at those sizes, file I/O becomes the
bottleneck for sqlite.
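
For illustration, a minimal sketch of such a bulk write with the HDF5
C API (1.8+ calling conventions; the file name, dataset name and sizes
are made up, and the dimensions are kept small here - scale them up for
real mass data):

#include <stdlib.h>
#include "hdf5.h"

int main(void)
{
    /* small stand-in for the mass data; real dims would be much larger */
    hsize_t dims[2] = { 1000, 100 };
    double *buf = malloc((size_t)(dims[0] * dims[1]) * sizeof *buf);
    for (size_t i = 0; i < dims[0] * dims[1]; i++)
        buf[i] = (double)i;               /* dummy values */

    /* one contiguous 2-D dataset, written in a single call */
    hid_t file  = H5Fcreate("mass.h5", H5F_ACC_TRUNC,
                            H5P_DEFAULT, H5P_DEFAULT);
    hid_t space = H5Screate_simple(2, dims, NULL);
    hid_t dset  = H5Dcreate(file, "/data", H5T_NATIVE_DOUBLE, space,
                            H5P_DEFAULT, H5P_DEFAULT, H5P_DEFAULT);
    H5Dwrite(dset, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf);

    H5Dclose(dset);
    H5Sclose(space);
    H5Fclose(file);
    free(buf);
    return 0;
}
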
But when analyzing your problem, keep in mind that hdf5 has limitations
of its own. Inserts and even appends are not easily achieved in hdf5,
and neither is every kind of read. It is still great when you have
multidimensional data in the sense scientific communities use the term.
You can select so-called hyperslabs from such datasets very, very
quickly.
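
Reading such a hyperslab looks roughly like this (continuing with the
file from the sketch above; the offsets and counts are arbitrary):

#include "hdf5.h"

int main(void)
{
    /* read rows 100..149 (all 100 columns) of /data from mass.h5 */
    hsize_t start[2] = { 100, 0 };
    hsize_t count[2] = { 50, 100 };
    static double slab[50][100];

    hid_t file   = H5Fopen("mass.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
    hid_t dset   = H5Dopen(file, "/data", H5P_DEFAULT);
    hid_t fspace = H5Dget_space(dset);

    /* select the hyperslab in the file, mirror its shape in memory */
    H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, count, NULL);
    hid_t mspace = H5Screate_simple(2, count, NULL);
    H5Dread(dset, H5T_NATIVE_DOUBLE, mspace, fspace, H5P_DEFAULT, slab);

    H5Sclose(mspace);
    H5Sclose(fspace);
    H5Dclose(dset);
    H5Fclose(file);
    return 0;
}

Only the selected region is read from disk, which is why slab access
stays fast even on files far bigger than RAM.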

Micha  
-- 
