Hi Everyone,

I'm new to the list and would like to start by saying hello!  So, hello!

I've been using SQLite3 for a while, though to be honest it's been more 
of a data store than a dynamic database in my applications thus far.  
I'm now starting on something where my needs are as follows:
* Store 70,000 records in one table in the database
* Based on a list from the user, extract up to 3,000 records from the 
table (matched on one common field)
* Use the 3,000 records for calculations, etc.
* Support concurrency of up to 10 such operations

The original database is about 8 MB on disk.  In simple terms, it may 
be best to insert the user's 3,000 records into another table, then 
join against the main table to fetch the remaining details and use 
them.  I don't yet have an efficient way to do this at run-time; in 
the past I've imported via a CSV file, but I suspect it's better to do 
it directly in memory some other way.  (I'm sure I should be able to 
find this somewhere.)
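Something like the following is what I have in mind, sketched with Python's sqlite3 module (the `records` table and `item_id` column are hypothetical stand-ins for my schema): the user's list is bound straight into a temporary table with executemany, no CSV round-trip, and a join pulls the details.

```python
import sqlite3

# Hypothetical schema: a main "records" table keyed on "item_id".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (item_id INTEGER PRIMARY KEY, value REAL)")
conn.executemany("INSERT INTO records VALUES (?, ?)",
                 [(i, i * 0.5) for i in range(10)])

# Load the user's list directly into a TEMP table (private to this
# connection) instead of going through a CSV file on disk.
user_ids = [2, 5, 7]
conn.execute("CREATE TEMP TABLE user_list (item_id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO user_list VALUES (?)",
                 [(i,) for i in user_ids])

# Join to fetch the remaining details for just those records.
rows = conn.execute(
    "SELECT r.item_id, r.value FROM records r "
    "JOIN user_list u ON u.item_id = r.item_id "
    "ORDER BY r.item_id"
).fetchall()
print(rows)  # [(2, 1.0), (5, 2.5), (7, 3.5)]
```

A TEMP table also disappears automatically when the connection closes, which seems like a good fit for per-request scratch data.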

For performance, I was thinking of using in-memory tables, since 
memory is not a concern in the system at the expected concurrency.  
I'm just wondering what the best way to do this would be.  I'm a bit 
concerned that if I create a table in the in-memory database and 
insert 3,000 records into it, it will lock the database, return busy, 
and hurt concurrency.
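One arrangement I'm considering, assuming each of the 10 operations can work on its own scratch data: every separate connection to ":memory:" is a completely separate database, so the inserts in one operation can never lock out or return busy to another.

```python
import sqlite3

def run_operation(user_ids):
    # Private in-memory database for this operation only; no other
    # operation shares it, so there is no lock contention at all.
    scratch = sqlite3.connect(":memory:")
    scratch.execute("CREATE TABLE work (item_id INTEGER PRIMARY KEY)")
    scratch.executemany("INSERT INTO work VALUES (?)",
                        [(i,) for i in user_ids])
    # ... calculations against the scratch database would go here ...
    (count,) = scratch.execute("SELECT COUNT(*) FROM work").fetchone()
    scratch.close()
    return count

# Ten operations, each with 3,000 records, fully isolated from each other.
results = [run_operation(list(range(3000))) for _ in range(10)]
print(results)
```

The trade-off is that a private database can't see the shared 70,000-record table; it would need its own copy or an ATTACH to the shared file.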

In the past (pre-database days), we loaded a bunch of records into 
memory and then made a quick copy (memcpy) of the entire structure in 
memory before starting to edit it.  What would be a good way to 
achieve this with SQLite?
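The closest SQLite analogue of that memcpy trick that I've found is the backup API (exposed in Python 3.7+ as Connection.backup): copy the whole database into a fresh in-memory one, then edit the copy without touching the original. A minimal sketch, with an in-memory database standing in for my 8 MB disk file:

```python
import sqlite3

# "src" stands in for the original on-disk database.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE records (item_id INTEGER PRIMARY KEY)")
src.executemany("INSERT INTO records VALUES (?)",
                [(i,) for i in range(100)])
src.commit()

# Page-level copy of the entire database into a new in-memory one,
# like a memcpy of the whole structure.
copy = sqlite3.connect(":memory:")
src.backup(copy)

# Edit the copy freely; the original is untouched.
copy.execute("DELETE FROM records WHERE item_id >= 50")
(n_copy,) = copy.execute("SELECT COUNT(*) FROM records").fetchone()
(n_src,) = src.execute("SELECT COUNT(*) FROM records").fetchone()
print(n_copy, n_src)  # 50 100
```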

I'd appreciate it if you could even just point me in the correct 
direction.  Any other recommendations would also be greatly appreciated!

Thanks & Best Regards
Mohit.


_______________________________________________
sqlite-users mailing list
[email protected]
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
