8GB is workable. Make sure you use prepared statements to avoid recompiling your
insert 500 million times. Also, with this much data, it would probably be a very
good idea to configure SQLite with a much larger memory cache. Don't expect a
miracle either. 500 million is a very large number, any way
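To illustrate the two suggestions above, here is a minimal sketch using Python's standard-library `sqlite3` module (the table name, columns, and cache size are hypothetical, and the row count is scaled way down). `executemany` compiles the INSERT once and rebinds parameters per row, which is the same pattern as `sqlite3_prepare_v2` plus `sqlite3_bind`/`sqlite3_step`/`sqlite3_reset` in the C API, and the `cache_size` PRAGMA enlarges the page cache at runtime:

```python
# Hedged sketch: reuse one prepared statement for a bulk insert and
# enlarge the page cache. Table/column names are made-up examples.
import sqlite3

conn = sqlite3.connect(":memory:")
# Negative value = size in KiB, so this asks for roughly a 200 MB cache.
conn.execute("PRAGMA cache_size = -200000")
conn.execute("CREATE TABLE points (x REAL, y REAL, z REAL)")

rows = ((float(i), float(i) * 2, float(i) * 3) for i in range(100_000))

with conn:  # wrap the whole batch in a single transaction
    # The INSERT is compiled once; each row is bound to the same statement.
    conn.executemany("INSERT INTO points VALUES (?, ?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM points").fetchone()[0]
```

Wrapping the batch in one transaction matters as much as the prepared statement: without it, SQLite commits (and syncs) after every single insert.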
John, thank you for the comments!
Maybe I wasn't clear - the 10TB data is separated. It contains a lot of other
data that I don't dream of storing in a database. But this bulk data is
structured in fixed-length records, each record containing a vector of floating
point values and some
I doubt SQLite is the right tool for this job, for a number of reasons.
First, if the data is as simple as you say, you are probably better off writing
your logic as straight C, rather than SQL. SQLite is VERY fast, but there is
still an incredible amount of overhead in executing a query, in
Sorry for the formatting - it looked better when I sent it from Yahoo's web
interface.
- Original Message
From: Igor Conom
To: sqlite-users@sqlite.org
Sent: Sat, October 17, 2009 9:03:54 AM
Subject: [sqlite] Creating a spatial index for a large number of points