Something related, though it doesn't really answer the question: if you
want to populate a database with that many rows, you can speed things up
considerably by wrapping the inserts in a single transaction (or a small
number of transactions). That way, when SQLite is running synchronously,
it doesn't have to flush data to disk after every INSERT statement. I
would expect it to run 10-100 times faster.
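A rough sketch of what that can look like with the C API mentioned in the
original post; the table items(id, name) is made up, and reusing a single
prepared statement is an extra optimization on top of the transaction:

#include <stdio.h>
#include <sqlite3.h>

/* Sketch: wrap the bulk insert in one transaction so SQLite only has to
   synchronize with the disk once at COMMIT instead of once per row. */
int bulk_insert(sqlite3 *db)
{
    char *errmsg = 0;

    if (sqlite3_exec(db, "BEGIN TRANSACTION;", 0, 0, &errmsg) != SQLITE_OK) {
        fprintf(stderr, "BEGIN failed: %s\n", errmsg);
        sqlite3_free(errmsg);
        return 1;
    }

    /* Prepare the INSERT once and rebind it for every row instead of
       re-parsing the SQL 100,000 times. */
    sqlite3_stmt *stmt = 0;
    if (sqlite3_prepare_v2(db, "INSERT INTO items(id, name) VALUES(?, ?);",
                           -1, &stmt, 0) != SQLITE_OK) {
        fprintf(stderr, "prepare failed: %s\n", sqlite3_errmsg(db));
        return 1;
    }

    for (int i = 0; i < 100000; i++) {
        sqlite3_bind_int(stmt, 1, i);
        sqlite3_bind_text(stmt, 2, "example", -1, SQLITE_STATIC);
        if (sqlite3_step(stmt) != SQLITE_DONE) {
            fprintf(stderr, "insert failed: %s\n", sqlite3_errmsg(db));
            break;
        }
        sqlite3_reset(stmt);
    }
    sqlite3_finalize(stmt);

    if (sqlite3_exec(db, "COMMIT;", 0, 0, &errmsg) != SQLITE_OK) {
        fprintf(stderr, "COMMIT failed: %s\n", errmsg);
        sqlite3_free(errmsg);
        return 1;
    }
    return 0;
}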
On Wed, 1 Feb 2006, deBooza (sent by Nabble.com) wrote:
Hi
I'm using SQLite in a C++ program, via the functions sqlite3_open,
sqlite3_exec, sqlite3_close, etc.
I am trying to import a lot of data, probably around 100,000+ rows. I have
found it quicker to format the data into SQL statements and then use the
shell command .read to pull the SQL in from a file and populate the
database, similar to the following:
.read c:\tmp\sql.txt
While this is a solution, it's not an ideal one. I would much prefer to do
it programmatically, something like this:
void main()
{
    while (not end of data)
    {
        Write data to file
    }
    ExecuteSQLFromFile( SQLFile )
}
Does anyone know how to do this?
What is the equivalent to .read?
Or does anyone know of a better way?
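(For illustration, one possible sketch of ExecuteSQLFromFile, assuming the
whole file fits in memory: sqlite3_exec executes every semicolon-separated
statement in the string it is given, so slurping the file into a single
string and passing it along behaves much like .read. The names here are
placeholders.)

#include <fstream>
#include <sstream>
#include <string>
#include <cstdio>
#include <sqlite3.h>

/* Read an entire SQL script into memory and hand it to sqlite3_exec,
   which runs each semicolon-separated statement in turn. */
int ExecuteSQLFromFile(sqlite3 *db, const char *path)
{
    std::ifstream in(path);
    if (!in)
        return 1;                       /* could not open the file */

    std::ostringstream buf;
    buf << in.rdbuf();                  /* slurp the whole file */
    std::string sql = buf.str();

    char *errmsg = 0;
    if (sqlite3_exec(db, sql.c_str(), 0, 0, &errmsg) != SQLITE_OK) {
        std::fprintf(stderr, "error: %s\n", errmsg);
        sqlite3_free(errmsg);
        return 1;
    }
    return 0;
}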
DFB
--
View this message in context:
http://www.nabble.com/Executing-SQL-from-file-t1040732.html#a2702793
Sent from the SQLite forum at Nabble.com.