Greetings!
I have a Windows service application written in C# using Visual Studio 2008
that uses the ADO.NET 2.0 provider for SQLite downloaded from SourceForge. The
application writes one row consisting of three values to each of 124 SQLite
databases once a minute. The service uses the SQLiteDatabase.ExecuteNonQuery()
method to write the data into the databases. As services usually are, this one
is designed to run forever. The problem is that it slowly increases memory
consumption. When it starts, Task Manager reports that it uses 34 megabytes of
memory. That number goes up by about 3 megabytes per hour.
When I comment out the call to ExecuteNonQuery(), the memory consumption of the
service remains constant.
This behavior is not acceptable. I can't have a service whose memory footprint
grows without limit, no matter how slowly it grows. I would have expected the
amount of memory consumed by my application before the call to
ExecuteNonQuery() to be nearly the same as the amount consumed after the call
finishes, with possibly some difference because I can't be sure when the .NET
garbage collector will do its work. What am I doing wrong? Is this a problem
in the provider? Is this a problem with SQLite itself? Should I be using
ADO.NET more correctly?
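One thing I've been using to separate "the GC just hasn't run yet" from a
genuine leak is to force a full collection and log the managed heap size. This
is just a diagnostic sketch (the MemoryProbe class and its call sites are my
own, not part of the provider): if the managed heap stays flat after forced
collections while Task Manager's number keeps climbing, the growth is
presumably in unmanaged memory (e.g. native SQLite handles that were never
disposed) rather than ordinary managed garbage.

```csharp
using System;

// Diagnostic helper: report the size of the managed heap after a
// forced, full garbage collection. Call before and after a batch of
// writes and compare the numbers against Task Manager's figure.
class MemoryProbe
{
    public static void Report(string label)
    {
        // Passing true forces a full collection and waits for pending
        // finalizers before the heap size is measured.
        long managedBytes = GC.GetTotalMemory(true);
        Console.WriteLine("{0}: managed heap = {1} bytes", label, managedBytes);
    }
}
```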
Here's the function:
public int Execute(string query)
{
    int rowsAffected;

    if (m_command == null)
    {
        throw new SQLiteException(
            "Attempt to execute a query on a closed or broken database.");
    }

    m_command.CommandType = CommandType.Text;
    m_command.CommandText = query;
    rowsAffected = 0;
    // rowsAffected = m_command.ExecuteNonQuery();
    return rowsAffected;
}
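For what it's worth, here is a sketch of how I imagine a leak-free version
might look, creating and disposing the command per call instead of holding a
long-lived m_command, so the command's native resources are released promptly
rather than whenever a finalizer eventually runs. The m_connection field here
is hypothetical (my real wrapper class holds m_command instead), so this is
just the pattern I'd expect, not my actual code:

```csharp
using System.Data;
using System.Data.SQLite;

public int Execute(string query)
{
    // m_connection is a hypothetical SQLiteConnection field standing in
    // for whatever the wrapper class actually keeps open.
    if (m_connection == null)
    {
        throw new SQLiteException(
            "Attempt to execute a query on a closed or broken database.");
    }

    // The using block guarantees Dispose() runs even if the query
    // throws, so the command does not linger until finalization.
    using (SQLiteCommand command = m_connection.CreateCommand())
    {
        command.CommandType = CommandType.Text;
        command.CommandText = query;
        return command.ExecuteNonQuery();
    }
}
```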
Thanks for your help!
RobR
_______________________________________________
sqlite-users mailing list
[email protected]
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users