Hi all,
I am planning to use SQLite for a project involving a large amount of data.
The planned SQLite file (which will contain a single table) will grow to
more than 4 GB in size. Has anyone tried that before, and what problems did
you encounter?

I heard that the initial disk caching time might be a problem, and that to
work around it we can open the file as a normal binary file and read it in
chunks to warm the OS cache. Can I just do incremental file seeks (instead
of actually reading the data)?
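For what it's worth, here is a minimal sketch of the chunked-read warm-up idea (the function name and chunk size are my own choices, not anything from SQLite itself). Note that seeking alone would not help: a seek only moves the file offset and performs no I/O, so the data has to be actually read for the OS to cache it.

```python
def warm_cache(path, chunk_size=1024 * 1024):
    """Read the whole file sequentially in fixed-size chunks so the OS
    page cache is populated before SQLite starts issuing random reads.

    Seeking alone is not enough: a seek only moves the file offset and
    performs no I/O, so the pages never enter the cache.
    """
    total = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:  # EOF reached
                break
            total += len(chunk)
    return total  # bytes read; should equal the file size
```

Whether this is worth doing depends on how much RAM is available relative to the 4 GB file; if the file does not fit in memory, the warm-up largely evicts itself.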

I wish SQLite could be enhanced to represent itself as a collection of
files, for example:

my_database_1
my_database_2
...

etc

Thanks,
Eric
