You can get the underlying SQLite version from DBD-SQLite as $dbh->{sqlite_version}. Make sure you set $dbh->{AutoCommit} = 0. This ensures you're always inside a transaction; without it, you're probably committing every single row. From what I can tell, you can toggle AutoCommit at any point in the program to turn it on (1) or off (0).
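To illustrate (in Python's bundled sqlite3 module rather than Perl/DBI, so it's easy to run standalone), here's a small sketch of the difference AutoCommit makes. In Perl the switch is $dbh->{AutoCommit} = 0; in Python's sqlite3 the rough equivalent is leaving isolation_level at its default so statements run inside an implicit transaction that you commit or roll back yourself. The table and values are made up for the demo.

```python
import sqlite3, tempfile, os

# Perl/DBI:  $dbh->{AutoCommit} = 0;   # work happens inside a transaction
# Python sqlite3: the default isolation_level ("") already gives you
# implicit transactions; you decide when to commit() or rollback().
# (isolation_level=None would be autocommit mode instead.)
path = os.path.join(tempfile.mkdtemp(), "demo.db")
con = sqlite3.connect(path, isolation_level="")

con.execute("CREATE TABLE t (x INTEGER)")
con.commit()

con.execute("INSERT INTO t VALUES (1)")
con.rollback()                 # uncommitted insert is discarded

con.execute("INSERT INTO t VALUES (2)")
con.commit()                   # this one survives

count = con.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)                   # 1: only the committed row remains
```

The rollback demonstrates that, with autocommit off, nothing is durable until you say so, which is exactly why per-row commits are so much slower: each commit forces a sync to disk.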
Using $dbh->{AutoCommit} = 0 and a DBI prepared statement, I see inserts read from a file running at around 5K/second on a 733 MHz P3 (Linux). I have a daily process that imports ~9,900 records in 2.077 seconds.

I think the docs for DBD-SQLite indicate that a writer always locks the whole file, so you probably can't do dirty reads with DBD-SQLite. Once you start writing, readers won't have access to the DB until the writer commits or rolls back.

-Clark

----- Original Message -----
From: Sripathi Raj <[EMAIL PROTECTED]>
To: sqlite-users@sqlite.org
Sent: Tuesday, April 4, 2006 4:18:35 PM
Subject: Re: [sqlite] DBD Sqlite

On 4/4/06, Nathan Kurz <[EMAIL PROTECTED]> wrote:
>
> > >> 3. The performance for inserts is really bad. Around 40k entries
> > >> takes a few hours. What might I be doing wrong? I do a commit
> > >> after all the inserts.
> > >
> > > A few things to help with speed:
> > >
> > > 1. Use DBI's prepared statements; eg, 1 prepare() and many execute().
> >
> > Yes, this is what I do.
> >
> > > 2. Don't commit for each row inserted but batch them so, say, you
> > > commit once per 1000 rows.
> >
> > Unfortunately, I cannot commit till I do all the inserts.
>
> That doesn't seem right for speed. In addition to using "commit", are
> you beginning a transaction with "begin"? Are your inserts
> particularly complex or large? More details about what you are doing
> would probably be good here, since something odd is happening.
> Maybe you could post a tiny test program along with the time it takes?
>
> --nate

I don't begin the transaction with begin. My assumption was that the
first insert operation would automatically begin a transaction. My
inserts are fairly simple, with two columns being long strings of
length 255.

my @values = ($task_info_gid, $file_type_gid, $extracted_path, $media_path,
              $size, $ctime, $mtime, $job_id, $is_in_du);

Raj