I'm running the following script on more than 1000 CSV files of about 2 MB each, and I'd like to speed it up if possible. I noticed that a WAL file is being created while it runs. Is there something better I can do to improve this process? Perhaps wrap everything in one transaction? Perhaps turn something off? The full run took about 1.5 hours. I import into a temp table because every CSV file has a header row with the column names, and I have to drop the table each time because of that header. I'm using the latest version of SQLite on a fast notebook.

.import 'TP962-A1-P1_TP962-A1-P2_01.CSV' temp_table
delete from temp_table where an = 1; -- using .read del_rec.sql
insert into external_lpr_assay_raw
select *, 'TP962-A1-P1_TP962-A1-P2_01.CSV' as filename from temp_table;
drop table if exists temp_table; -- using .read drop_table.sql
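To show roughly what I mean by "one transaction": here is a sketch using Python's sqlite3 module instead of the shell, since the shell commits each statement on its own. The database name, the three data columns, and the PRAGMA choices are illustrative assumptions, not my actual schema.

```python
import csv
import glob
import sqlite3

con = sqlite3.connect("assays.db")  # hypothetical database file name
# Illustrative speed settings: trade crash-safety for bulk-import speed.
con.execute("PRAGMA journal_mode = OFF")
con.execute("PRAGMA synchronous = OFF")
# Hypothetical three-column layout plus the filename column.
con.execute(
    "CREATE TABLE IF NOT EXISTS external_lpr_assay_raw (a, b, c, filename)"
)

with con:  # one transaction wrapping the entire batch of files
    for path in sorted(glob.glob("*.CSV")):
        with open(path, newline="") as f:
            rows = csv.reader(f)
            next(rows)  # skip the header row instead of deleting it afterwards
            con.executemany(
                "INSERT INTO external_lpr_assay_raw VALUES (?, ?, ?, ?)",
                ((*row, path) for row in rows),
            )
con.close()
```

Skipping the header at read time also removes the need for the temp table and the delete/drop steps entirely.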

Joe Fisher
Oregon State University



_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
