On Sun, Jan 19, 2014 at 10:59 PM, Mario M. Westphal m...@mwlabs.de wrote:
If I set wal_autocheckpoint=10000, I will get 1/10 of the syncs and a WAL
file of about 10 MB, correct?
http://www.sqlite.org/pragma.html#pragma_wal_autocheckpoint states it's a
page count, so that depends on the page size in use.
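Since the pragma takes a page count, the byte threshold works out to roughly pages × page_size. A quick sketch of that using Python's built-in sqlite3 module (an illustration, not code from the thread):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# wal_autocheckpoint is a page count; the default is 1000 pages.
pages = conn.execute("PRAGMA wal_autocheckpoint").fetchone()[0]
page_size = conn.execute("PRAGMA page_size").fetchone()[0]
print(pages)              # 1000
print(pages * page_size)  # checkpoint threshold in bytes (~1MB with a 1KB page size)

# Raising the page count raises the threshold proportionally.
conn.execute("PRAGMA wal_autocheckpoint=10000")
print(conn.execute("PRAGMA wal_autocheckpoint").fetchone()[0])  # 10000
```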
(Unrelated to your question, but take a look at external content FTS4
tables; they dramatically cut down the amount of duplicated data [1].)
Thanks for the tip. I'll definitely check that.
Currently I build the contents for FTS from several other tables, combining,
splitting, and merging data via SQL.
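For reference, a minimal sketch of what an external content FTS4 table looks like (the table and column names here are made up, and this assumes an SQLite build with FTS4 enabled):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Ordinary table that owns the actual text (hypothetical schema).
conn.execute("CREATE TABLE docs(id INTEGER PRIMARY KEY, body TEXT)")

# External content FTS4 index: content="docs" makes FTS4 read the text
# from the docs table instead of storing its own duplicate copy.
conn.execute('CREATE VIRTUAL TABLE docs_fts USING fts4(content="docs", body)')

conn.execute("INSERT INTO docs VALUES (1, 'slow COMMIT during ingest')")
# With external content tables, the application keeps the index in sync itself.
conn.execute("INSERT INTO docs_fts(docid, body) SELECT id, body FROM docs")

print(conn.execute(
    "SELECT docid FROM docs_fts WHERE docs_fts MATCH 'ingest'").fetchall())  # [(1,)]
```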
I'm seeing a performance effect that I don't quite understand.
Maybe I'm using the wrong settings or something. Sorry for the long post,
but I wanted to include all the info that may be important.
My software is written in C++, runs on Windows 7/8, the SQLite database file
is either on a local SATA
On 19 Jan 2014, at 2:00pm, Mario M. Westphal m...@mwlabs.de wrote:
I logged the execution times of various operations in this phase to a text
file. Everything was fast, the processing, the INSERTs etc.
But COMMIT operations sometimes took 20s, then 0.2s, then again 10s. That's
the time
In WAL mode with synchronous=NORMAL, SQLite only syncs (FlushFileBuffers()
on Windows) when it does a checkpoint operation. Checkpoints should be
happening automatically whenever the WAL file exceeds about 1MB in size.
For an 8GB database, that probably means about 8000 sync operations.
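The arithmetic behind that estimate, spelled out (the figures are the approximate ones from the messages in this thread):

```python
# An 8GB database written through ~1MB autocheckpoint intervals
# means roughly one sync per MB of data:
db_size_mb = 8 * 1024
checkpoint_every_mb = 1
syncs = db_size_mb // checkpoint_every_mb
print(syncs)  # 8192, i.e. "about 8000" sync operations

# If those syncs account for the 4.5 hours mentioned elsewhere in the
# thread, each one costs on the order of two seconds:
print(4.5 * 3600 / syncs)  # ~1.98 seconds per sync
```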
On Jan 19, 2014, at 3:00 PM, Mario M. Westphal m...@mwlabs.de wrote:
Also FTS4 is used, which also creates large tables.
(Unrelated to your question, but take a look at external content FTS4 tables…
they dramatically cut down the amount of duplicated data [1])
During an ingest phase, my
If you want to try running with synchronous=NORMAL, you might try setting
PRAGMA wal_autocheckpoint=50000; (from the default of 1000), which will
make for dramatically larger WAL files, but also dramatically fewer syncs.
Then the syncs will take just 5 or 6 minutes instead of 4.5 hours.
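Applied from application code, that configuration might look like the sketch below (using Python's sqlite3 module for illustration; the file path is made up, and 50000 is just one possible larger-than-default autocheckpoint value):

```python
import os
import sqlite3
import tempfile

# WAL mode needs an on-disk database, so use a scratch file.
path = os.path.join(tempfile.mkdtemp(), "ingest.db")
conn = sqlite3.connect(path)

print(conn.execute("PRAGMA journal_mode=WAL").fetchone()[0])  # wal
conn.execute("PRAGMA synchronous=NORMAL")
conn.execute("PRAGMA wal_autocheckpoint=50000")  # far fewer checkpoints, hence syncs

# Batch the ingest inside transactions; with the settings above,
# COMMIT no longer syncs; only checkpoints do.
conn.execute("CREATE TABLE t(x)")
with conn:
    conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(1000)])
print(conn.execute("SELECT count(*) FROM t").fetchone()[0])  # 1000
```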