Yes, in theory.

However, when about 100 queries are fired within a second, only a few of them actually get their data written.

I cannot tell when to start or end the transactions, because the events are fired from a third-party system. So I used a timer set to 5 seconds, then tried 10 seconds, and got the same result either way (the timer calls a procedure that commits the current transaction and begins a new one).
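For reference, the timer procedure does roughly the following (a minimal sketch against the SQLite3 C API; db and the function names are placeholders, and error handling is trimmed down):

#include <sqlite3.h>
#include <stdio.h>

static sqlite3 *db;   /* opened elsewhere with sqlite3_open() */

/* Open the first batch transaction at startup. */
static void begin_batch(void)
{
    char *err = NULL;
    if (sqlite3_exec(db, "BEGIN", NULL, NULL, &err) != SQLITE_OK) {
        fprintf(stderr, "BEGIN failed: %s\n", err);
        sqlite3_free(err);
    }
}

/* Called from the 5 (or 10) second timer: commit whatever the event
   handlers inserted since the last tick, then start the next batch. */
static void on_timer(void)
{
    char *err = NULL;
    if (sqlite3_exec(db, "COMMIT", NULL, NULL, &err) != SQLITE_OK) {
        fprintf(stderr, "COMMIT failed: %s\n", err);
        sqlite3_free(err);
    }
    begin_batch();
}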

Of about 100 events, only around 30 were actually processed.

About the "can't read while writing" limitation: how do I avoid it? I can't stop my system while it is using the database; there would be no point in having a database then.
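If the fix is the retry pattern you describe below, I imagine it looks something like this (just a sketch against the C API; exec_with_retry is an illustrative name, and I gather sqlite3_busy_timeout() can do much the same thing in one call):

#include <sqlite3.h>
#include <unistd.h>   /* usleep() */

/* Run one statement, sleeping and retrying while the database is
   locked by another thread or process, instead of failing outright. */
static int exec_with_retry(sqlite3 *db, const char *sql)
{
    for (;;) {
        int rc = sqlite3_exec(db, sql, NULL, NULL, NULL);
        if (rc != SQLITE_BUSY && rc != SQLITE_LOCKED)
            return rc;         /* success, or an error that won't clear */
        usleep(10 * 1000);     /* locked: wait 10 ms, then try again */
    }
}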

----- Original Message ----- From: "Jay Sprenkle" <[EMAIL PROTECTED]>
To: <sqlite-users@sqlite.org>
Sent: Wednesday, July 12, 2006 12:59 PM
Subject: Re: [sqlite] Problems with Multi-Threaded Application.


On 7/12/06, Gussimulator <[EMAIL PROTECTED]> wrote:
> Hello,
>
> I'm currently using SQLite3 in my multi-threaded software.
>
> I have tried several ways of dealing with my issue; however, I came to
> the conclusion that there must be some trick I haven't been told of.

It wasn't clear to me when I started that you can have as many readers as you
want, but only one process may write to the database AND no other process
may read it while you are writing. If you use transactions and retry
when the database is locked it works fine and no data will be lost.


--
SqliteImporter and SqliteReplicator: Command line utilities for Sqlite
http://www.reddawn.net/~jsprenkl/Sqlite

Cthulhu Bucks!
http://www.cthulhubucks.com
