Since SQLite caches data in memory, you are unlikely to achieve spectacular improvements by making the database memory resident. Perhaps you could investigate pre-processing your data to reduce the time it takes to render your graph.
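
For example, if the raw samples live in something like a hypothetical points(ts, value) table, one rough sketch of that pre-processing (Python's sqlite3 module here; every file, table and column name below is an assumption, not your actual schema) is to pre-aggregate the samples into fixed-size time buckets so the graph only reads a few thousand rows instead of millions:

    import sqlite3

    conn = sqlite3.connect("process_data.db")   # file name is an assumption
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS points_downsampled (
            bucket  INTEGER PRIMARY KEY,   -- start of each 60-second interval (epoch seconds)
            avg_val REAL,
            min_val REAL,
            max_val REAL
        );
        DELETE FROM points_downsampled;
        INSERT INTO points_downsampled
        SELECT (ts / 60) * 60, AVG(value), MIN(value), MAX(value)
        FROM points
        GROUP BY ts / 60;
    """)
    conn.commit()
    conn.close()

The graphing application would then plot points_downsampled and only touch the raw table when the user zooms in far enough to need it.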

Rob Richardson wrote:
Greetings!

We are using an SQLite database to store process data that will
eventually be displayed on a graph.  The database design is simple,
including only six tables, but the table containing the data points for
the graph could contain a few million records.  By using the simplest
possible query and asking for the bare minimum of data I need at any one
point, I've managed to get the time to display the graph down from a few
minutes to about 15 seconds for a sample database with 1.3 million
records.
But I'm wondering if I can use an in-memory database to improve this
dramatically.  The data is collected by a Windows service that adds it to
the database once a minute.  If the service would
also store the data into an in-memory database, and the graphing
application could somehow read the same database, I ought to be able to
get unbelievable speed.  Is this feasible?  If so, how would I set it
up?

Another possibility might be to read the entire database from disk into
an in-memory database when the graphing application starts up, if
there's a way to do that which is much faster than a set of INSERT INTO
newtable SELECT * FROM oldtable (or whatever -- you get the idea)
statements.
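
Something along these lines is the kind of bulk copy I have in mind (Python used purely for illustration; the file name, the points table and its ts column are made up):

    import sqlite3

    # Open the in-memory database the graphing application will query.
    mem = sqlite3.connect(":memory:")

    # Attach the on-disk file and copy the big table across in one statement.
    mem.execute("ATTACH DATABASE 'process_data.db' AS disk")
    mem.execute("CREATE TABLE points AS SELECT * FROM disk.points")
    mem.execute("CREATE INDEX points_ts ON points(ts)")  # indexes are not copied automatically
    mem.commit()                                         # no open transaction before DETACH
    mem.execute("DETACH DATABASE disk")

    # On SQLite builds that include the online backup API, the whole database
    # (schema and indexes included) can instead be copied in one call:
    #     disk = sqlite3.connect("process_data.db")
    #     disk.backup(mem)   # Python 3.7+ / SQLite 3.6.11+
    #     disk.close()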

Thank you very much.

Rob Richardson
RAD-CON INC.
