I have a benchmark program that demonstrates significant performance
degradation over time.  Here is what this benchmark does:

    1. Create a table with an integer primary key and a blob value.
    2. Populate the table with 1M rows.  In all cases the blobs are strings
       with random lengths between 1 and 1k bytes.
    3. Loop forever, making passes.  A pass does 1M operations, each of
       which is randomly selected and can be either:
        a. a re-insert (replace) of an existing row, chosen at random, or
        b. an insert of a new row as in 2.
       Operations are grouped with BEGIN and COMMIT into batches of 1k;
       a sketch of the pass loop follows below.
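For reference, here is a minimal sketch of such a pass loop in C against
the SQLite API.  It is not the attached dftest5 itself: the counters.db
filename, the REPLACE INTO statement, and the exact 50/50 operation split
are illustrative assumptions drawn from the description above, not
necessarily what the attachment does.

    /* Minimal sketch of one endless run of passes; assumes the counters
       table already holds 1M rows with keys 0..999999 as in step 2. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <time.h>
    #include <sqlite3.h>

    #define BATCH 1000              /* operations per BEGIN/COMMIT batch */

    int main(int argc, char **argv)
    {
        /* ops per pass; assumed to be a multiple of BATCH */
        int nops = (argc > 1) ? atoi(argv[1]) : 1000000;
        long next_inx = 1000000;    /* next unused key for new rows */
        sqlite3 *db;
        sqlite3_stmt *stmt;
        char blob[1000];
        int i, pass = 0;

        /* the database filename here is an assumption */
        if (sqlite3_open("counters.db", &db) != SQLITE_OK) {
            fprintf(stderr, "open failed: %s\n", sqlite3_errmsg(db));
            return 1;
        }
        /* REPLACE INTO covers both operation types: binding an existing
           inx overwrites that row, a fresh inx inserts a new one. */
        sqlite3_prepare(db,
            "replace into counters (inx, value) values (?, ?)",
            -1, &stmt, 0);

        for (;;) {                  /* loop forever, making passes */
            time_t start = time(0);
            for (i = 0; i < nops; i++) {
                if (i % BATCH == 0)
                    sqlite3_exec(db, "begin", 0, 0, 0);
                if (rand() % 2)     /* a: re-insert an existing row */
                    sqlite3_bind_int64(stmt, 1, rand() % next_inx);
                else                /* b: insert a new row as in 2 */
                    sqlite3_bind_int64(stmt, 1, next_inx++);
                memset(blob, 'x', sizeof(blob));    /* contents don't matter */
                /* random length between 1 and 1k bytes */
                sqlite3_bind_blob(stmt, 2, blob, 1 + rand() % 1000,
                                  SQLITE_STATIC);
                sqlite3_step(stmt);
                sqlite3_reset(stmt);
                if (i % BATCH == BATCH - 1)
                    sqlite3_exec(db, "commit", 0, 0, 0);
            }
            printf("pass %d: %ld seconds\n", ++pass,
                   (long)(time(0) - start));
            fflush(stdout);
        }
    }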

Here are plots of the time for each pass on a modern Linux machine (FC3,
>3GHz processor, IDE drives), and a modern XP machine (similar hardware).

    http://rutt.smugmug.com/photos/23341774-O.jpg

Both show an immediate and steep degradation.  Fine, it would be great to
be able to sustain 52 microseconds/operation (about 52 seconds per
1M-operation pass), but even 10x that is darn good.  The disturbing thing
is that the performance doesn't seem to have stabilized, but continues to
drift upward slowly.

This is using sqlite-3.1.3.  I have attached my benchmark program.  If you
want to run it:

    1. Create the table with the sqlite3 command line:

        sqlite3
        create table counters (inx integer primary key, value blob);
        
    2. Run with:

        dftest5 1000000

It will run forever, printing times per pass.
