Eric JESOVER wrote:
How can I limit the amount of RAM SQLite uses?
I have a 4 MB database, but after some SELECTs and INSERTs the amount of
RAM used by the process is 100 MB.
I want to limit this to something like 6 or 8 MB.
Is there a flag somewhere?
I understand this will cause some performance issues, but that is not a
problem for my application.


Any update to the database requires 1 byte of RAM for every 4K of
database.  For your 4MB database, that is only 1K of RAM and is not a
factor.  (But it can be a factor for those 10GB databases I keep hearing
about!)

If you use an ORDER BY clause that cannot be satisfied by an
index, then the sort occurs in RAM.  That means the entire
result set must fit in RAM.  This only occurs if there are
no indices available which would allow the result set to be
pulled from the database in sorted order.
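
For example, something along these lines -- just a rough sketch using
the SQLite 2 C API, where the "test.db" file and the "players" table
with its "name" and "score" columns are made up for illustration:

#include <stdio.h>
#include "sqlite.h"

/* Print each row as it is delivered by sqlite_exec(), one at a time. */
static int print_row(void *notUsed, int argc, char **argv, char **azColName){
  int i;
  for(i=0; i<argc; i++){
    printf("%s%s", argv[i] ? argv[i] : "NULL", i==argc-1 ? "\n" : " | ");
  }
  return 0;  /* a non-zero return would abort the query */
}

int main(void){
  char *zErr = 0;
  sqlite *db = sqlite_open("test.db", 0, &zErr);
  if( db==0 ){
    fprintf(stderr, "cannot open database: %s\n", zErr);
    return 1;
  }

  /* With no index on "name", this ORDER BY forces the whole result
  ** set to be collected and sorted in RAM before any row is returned
  ** to the callback. */
  sqlite_exec(db, "SELECT name, score FROM players ORDER BY name;",
              print_row, 0, &zErr);

  /* With an index on the sort column, rows can be pulled from the
  ** database already in sorted order, so no big in-memory sort. */
  sqlite_exec(db, "CREATE INDEX idx_players_name ON players(name);",
              0, 0, &zErr);
  sqlite_exec(db, "SELECT name, score FROM players ORDER BY name;",
              print_row, 0, &zErr);

  sqlite_close(db);
  return 0;
}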

The entire result set of any query that uses aggregate functions
must fit in RAM.  Normally this is not a problem since queries
that use aggregates typically have small result sets.
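
For instance, reusing the db handle and print_row callback from the
sketch above (and the same made-up "players" table):

/* An aggregate collapses many rows into few: this query returns
** exactly one row no matter how large the table is. */
sqlite_exec(db, "SELECT count(*), avg(score) FROM players;",
            print_row, 0, &zErr);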

If you use the sqlite_get_table() API, the entire result set is
stored in RAM and returned to you as a pointer.
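
A rough comparison, continuing the sketch above (same db handle,
print_row callback, and made-up "players" table):

char **azResult;
int nRow, nCol;

/* sqlite_get_table(): the whole result set is materialized in one
** big in-memory array.  Convenient, but expensive for big results. */
if( sqlite_get_table(db, "SELECT name, score FROM players;",
                     &azResult, &nRow, &nCol, &zErr)==SQLITE_OK ){
  /* azResult[0..nCol-1] hold the column names; data rows follow. */
  sqlite_free_table(azResult);
}

/* sqlite_exec() with a callback delivers one row at a time instead,
** so only the current row needs to be held in RAM. */
sqlite_exec(db, "SELECT name, score FROM players;", print_row, 0, &zErr);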

The pager layer for the database will use up to about 1.5MB of
RAM to buffer database pages.  You can lower this using the
"cache_size" pragma if you want.  A smaller cache size might
result in slower access.
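
For example, continuing the sketch above, to make the cache much
smaller for the current connection (100 pages is just an arbitrary
small number):

/* cache_size is measured in pages, not bytes.  Shrinking it trades
** RAM for extra disk reads; the setting lasts only for this session. */
sqlite_exec(db, "PRAGMA cache_size = 100;", 0, 0, &zErr);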


-- D. Richard Hipp -- [EMAIL PROTECTED] -- 704.948.4565

