Yep, I'm sure.

30 per second is fine.
Parsing time is included. And to be precise, it is 30 files per second being
inserted into the database, but each file produces several inserts; I didn't
want to complicate the explanation with that.
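
Roughly, the per-file part looks like the sketch below. It is simplified and
written against the System.Data.SQLite provider directly instead of my
SQLiteWrapper, and the table, columns, and the files parameter are made up
just to illustrate the pattern: one transaction, one reused parameterized
INSERT.

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SQLite;

// Sketch only: hypothetical table/columns; in the real code each file is
// parsed between the inserts, which is why parsing time is in the 30/sec.
static void ImportFiles(string dbPath, IEnumerable<KeyValuePair<string, string[]>> files)
{
    using (var conn = new SQLiteConnection("Data Source=" + dbPath))
    {
        conn.Open();
        using (var tx = conn.BeginTransaction())
        using (var cmd = conn.CreateCommand())
        {
            cmd.Transaction = tx;
            cmd.CommandText =
                "INSERT INTO items (file, payload) VALUES (@file, @payload)";
            var pFile = cmd.Parameters.Add("@file", DbType.String);
            var pPayload = cmd.Parameters.Add("@payload", DbType.String);

            foreach (var f in files)            // ~30 files per second
            {
                foreach (var row in f.Value)    // several inserts per file
                {
                    pFile.Value = f.Key;
                    pPayload.Value = row;
                    cmd.ExecuteNonQuery();      // reuse the same command
                }
            }
            tx.Commit();                        // single commit at the end
        }
    }
}

Everything goes through the one open transaction; the only thing happening
between the inserts in the real code is the parsing of each file.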




Richard Hipp-3 wrote:
> 
> On Wed, Nov 9, 2011 at 5:21 PM, yqpl <y...@poczta.onet.pl> wrote:
> 
>>
>> Hi,
>>
>> My task is to parse a lot of files and then insert them into an SQLite
>> database. It could be thousands of files. I use C#.
>>
>> I'm starting a transaction,
>> then making a lot of inserts, and committing.
>> I get about 30 inserts per second
> 
> 
> I typically get 100,000 rows per second on a modern workstation, from
> inside a transaction.  Are you *sure* you are using a transaction?
> 
> 
> 
>> but after a while it drops to
>> about 1-2 inserts per second. It takes about ~500 inserts to slow down to
>> that 1-2 inserts per second.
>>
>> I have indexes on this database, but that makes no difference, as I
>> checked on a copy without indexes.
>>
>> 1) Why does it get "tired" and slow down, and how do I fix it?
>>
>> I tried doing this in a loop of 100 inserts to keep the speed reasonable,
>> then closing the database and reopening it.
>> After the close and reopen, the next commit gets SQLITE_BUSY /* The
>> database file is locked */.
>>
>> 2) What is going on here?
>>
>> Please help me, I'm stuck.
>>
>> The code is roughly:
>>
>> // done by a background worker
>> void import(object sender, DoWorkEventArgs e)
>> {
>>     DataTable tab = PST_POSTGRES.Postgres.Query(textBox6.Text, textBox1.Text,
>>         textBox3.Text, textBox4.Text, textBox5.Text, textBox2.Text);
>>
>>     SQLiteWrapper.SQLite db = new SQLiteWrapper.SQLite();
>>     db.OpenDatabase(sqlite_db);
>>     db.BeginTransaction();
>>
>>     foreach (DataRow r in tab.Rows)
>>     {
>>         if (bw.CancellationPending == true)
>>         {
>>             e.Cancel = true;
>>             break;
>>         }
>>         // import here
>>
>>         foreach (object o in imported)
>>         {
>>             doinserts(o);
>>         }
>>
>>         // reopen condition - triggered when the inserts get slower
>>         if (((long)(imported * 1000)) / stoper.ElapsedMilliseconds < next)
>>         {
>>             db.CommitTransaction();
>>             db.CloseDatabase();
>>             db = new SQLiteWrapper.SQLite();
>>             db.OpenDatabase(sqlite_db);
>>             db.BeginTransaction();
>>         }
>>         next = ((long)(imported * 1000)) / stoper.ElapsedMilliseconds;
>>     }
>>
>>     db.CommitTransaction();
>>     db.CloseDatabase();
>>     stoper.Stop();
>> }
> 
> 
> 
> -- 
> D. Richard Hipp
> d...@sqlite.org
