Whether this is a memory leak or not sort of depends on your
definition.  If a process is designed to remain open for long periods
of time with little activity, and it ends up taking up 1 gigabyte of
memory, that looks an awful lot like a leak to me.  There are likely
to be at least three instances of this application running, and after
they have all run for a month, they are likely to be consuming 5
gigabytes of memory.  That is not acceptable.  If SQLite's sorted
query takes up 2.5 megabytes of memory every time this piece of the
application is invoked, I need to know how to ensure that that memory
is released.
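
Christian's temp_store suggestion below looks like the relevant knob.
Here is a minimal sketch of how it could be applied from the C API;
the file name and the helper function are hypothetical, not our real
code:

    /* Sketch only: open a per-base trend database and direct
     * temporary tables (used for ORDER BY, aggregates, etc.) to
     * disk instead of process memory. */
    #include <sqlite3.h>

    int open_base_db(const char *path, sqlite3 **db)
    {
        int rc = sqlite3_open(path, db);
        if (rc != SQLITE_OK)
            return rc;
        /* Send the sort's working storage to a temp file on disk. */
        return sqlite3_exec(*db, "PRAGMA temp_store = FILE;", 0, 0, 0);
    }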

Here's a brief description of the application.  My company, Rad-Con,
Inc., is a major supplier of annealing furnaces and related equipment
and software to metal processors worldwide.  The application monitors
the annealing process on a customer's site.  There could be well over
a hundred annealing bases.  The application's first screen displays an
overview of all of the bases, whether they have furnaces, whether the
furnaces are turned on, and so on.  A user can double-click on a base to
see details.  A button on the detail screen calls up a trend display.
Trend data is stored in SQLite database files, one per base.  The
application executes the query I described to find when the last row
was written to the table, and uses that to calculate the times that
will be displayed on the graph.  Then, the application reads the
entire table and plots the data.  When the user is finished, he closes
the trend screen.  My requirement is that the amount of memory
allocated to my application be the same before the trend screen is
displayed and after it is closed.  If more memory is allocated after
it is closed, that is a leak, by my definition.
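
For concreteness, this is the shape of the access pattern I am
describing, as a sketch only; the table and column names here are
hypothetical, not our real schema.  The point is that every statement
is finalized and the per-base connection is closed when the trend
screen goes away, so nothing keeps that memory pinned:

    /* Sketch of the trend-screen lookup (hypothetical schema:
     * table "trend", column "sample_time"). */
    #include <sqlite3.h>

    sqlite3_int64 last_sample_time(sqlite3 *db)
    {
        sqlite3_stmt *stmt = 0;
        sqlite3_int64 t = 0;
        /* Same effect as the MAX() query: newest row's timestamp. */
        if (sqlite3_prepare_v2(db,
                "SELECT MAX(sample_time) FROM trend;",
                -1, &stmt, 0) == SQLITE_OK
            && sqlite3_step(stmt) == SQLITE_ROW)
        {
            t = sqlite3_column_int64(stmt, 0);
        }
        sqlite3_finalize(stmt);  /* releases the statement's memory */
        return t;
    }

    /* When the trend screen closes: every prepared statement must
     * already be finalized, or sqlite3_close() returns SQLITE_BUSY
     * and the connection's memory stays allocated. */
    void close_trend_db(sqlite3 *db)
    {
        sqlite3_close(db);
    }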


RobR


On 3/23/08, Christian Smith <[EMAIL PROTECTED]> wrote:
> On Fri, Mar 21, 2008 at 10:41:10AM -0400, Rob Richardson wrote:
> > My SQLite library is built from the single translation unit
> > sqlite.c/sqlite.h.  That file contains the version number 3.3.17.
> >
> > I do not have valgrind, but circumstantial evidence that this is a
> > SQLite problem is strong.  When stepping through my code, I see that
> > my application's memory jumps by over 2.5 megabytes when the
> > sqlite3_step() method is called when using either the sorted query or
> > the query using max().  The unsorted query doesn't show any memory
> > jump.  Also, the difference in memory consumption before this part of
> > the code is executed and after it is left is the same size as the jump
> > in memory when sqlite3_step() is called.
>
>
> When doing a sorted query, the result set is formed in a temporary database
> somewhere defined by the environment. In your case, it sounds like the
> temporary database is memory based. Once the result set is done with, SQLite
> frees the memory, but the allocator may not return it to the OS, so it will
> still show under the process's virtual memory footprint.
>
> You can tell SQLite to use a disk based temporary database using:
> http://sqlite.org/pragma.html#pragma_temp_store
>
> Using this, your memory usage will probably be more stable.
>
> However, this certainly isn't a memory leak.
>
>
> >
> > RobR
> >
>
> Christian
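
If it helps pin down whether the 2.5-megabyte jump at sqlite3_step()
really sits inside SQLite's allocator, something like this sketch
could bracket the call with SQLite's own accounting.  Note that
sqlite3_memory_used() and sqlite3_memory_highwater() only exist in
SQLite 3.5 and later, so this would mean upgrading from 3.3.17:

    /* Sketch: measure SQLite's heap usage around a single step.
     * Requires SQLite 3.5+ (not available in 3.3.17). */
    #include <stdio.h>
    #include <sqlite3.h>

    void report_step_memory(sqlite3_stmt *stmt)
    {
        sqlite3_int64 before = sqlite3_memory_used();
        int rc = sqlite3_step(stmt);
        sqlite3_int64 after = sqlite3_memory_used();

        printf("step rc=%d, sqlite heap delta=%lld, highwater=%lld\n",
               rc, (long long)(after - before),
               (long long)sqlite3_memory_highwater(0));
    }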

