Hi.  I've started a SQLite C++ project that could peak at 100 million records
(250 bytes per record spread over 20 fields) and would like to ask if anyone
has seen SQLite projects of this magnitude.

The Windows data-logging application will add up to 1 million records per day
and run queries returning approximately 20,000 records a few times a day.

I fully understand that performance will depend on the coding, database
structure, and indexing (and hardware) but, assuming these are taken care of,
should a 100 million record table perform in roughly the same performance
class as other popular databases?

My concern is hitting a brick wall in performance after, say, 2 million
records, so I wanted to post the question here before I go too far down the
development road.

I'd appreciate hearing whether you've seen SQLite databases this large.

Best regards,
Dan Jenkins

-- 
View this message in context: 
http://www.nabble.com/100-million-records-will-be-fine--tp22038526p22038526.html
Sent from the SQLite mailing list archive at Nabble.com.

_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
