On 9 Aug 2011, at 2:27pm, Jaco Breitenbach wrote:

> The problem that I'm facing is that I would ultimately need to process
> 1,000,000,000 records a day, with history kept for up to 128 days.  I
> am currently creating a new data file per day, with hourly tables.  However,
> that will eventually result in 40,000,000+ records being inserted into a
> single indexed table.  Unfortunately, the insert rate into the indexed
> tables decreases significantly as the number of records in the tables
> increases.  This seems to be a CPU bottleneck in the index searches rather
> than I/O.

I suspect that SQLite is not a good system for you to be using.  With that kind 
of traffic you're going to benefit a great deal from a database system that 
keeps its caches warm between operations.  SQLite keeps only a modest 
per-connection page cache by default, because it has to work well on embedded 
systems that don't have a lot of spare memory.
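
For what it's worth, you can raise SQLite's per-connection page cache and batch 
your inserts inside explicit transactions, which usually buys a fair amount of 
insert throughput.  A minimal sketch in Python (the file name, table name, 
schema and pragma values are made up for illustration, not taken from your 
setup):

    import sqlite3

    conn = sqlite3.connect("2011-08-09.db")      # one data file per day, as you do now
    conn.execute("PRAGMA journal_mode = WAL")    # fewer fsyncs per commit
    conn.execute("PRAGMA synchronous = NORMAL")  # trade a little durability for speed
    conn.execute("PRAGMA cache_size = -262144")  # ~256 MB page cache (negative = KiB)
    conn.execute("CREATE TABLE IF NOT EXISTS hour_14 (key TEXT, payload TEXT)")
    conn.execute("CREATE INDEX IF NOT EXISTS hour_14_key ON hour_14(key)")

    def insert_batch(rows):
        # One commit per batch instead of per row; the index B-tree is
        # still updated for every row, which is where the CPU time goes.
        with conn:
            conn.executemany("INSERT INTO hour_14 VALUES (?, ?)", rows)

    insert_batch([("k1", "payload-1"), ("k2", "payload-2")])

Even then, every insert still has to descend the index B-tree, so this only 
buys headroom; it doesn't change the underlying curve you're seeing.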

Take a look at systems with a client/server architecture, such as Postgres, 
or, as other people have pointed out, at systems that are just key-value 
stores rather than full SQL engines.
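
To make the key-value idea concrete, here is a toy sketch of the same write 
path using Python's built-in dbm module (file and key names invented; a real 
deployment would use a proper store such as Berkeley DB, but the access 
pattern is the same):

    import dbm

    db = dbm.open("2011-08-09-hour-14", "c")  # "c" = create if missing
    # Hashed key -> value writes: no SQL parsing and no secondary
    # B-tree index to maintain, so per-record cost stays roughly flat.
    db[b"record-key-1"] = b"payload-1"
    db[b"record-key-2"] = b"payload-2"
    print(db[b"record-key-1"])                # exact-key lookups only
    db.close()

The trade-off is that you lose range scans and ad-hoc SQL entirely, so whether 
this works depends on what your searches actually look like.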

Simon.