Is there some way to load a ".csv" file into a SQLite database table
when the first row (record) of the file contains headers?
This does not work:
----------------------------------------------
.separator ,
.import test.csv ld_assay
Here's my database / table:
CREATE TABLE [ld_assay] (
[id] INTEGER PRIMARY KEY,
[barcode_id] NCHAR(9),
[light] BOOLEAN,
[bpm] REAL);
Here's my "test.csv":
id,barcode_id,light,bpm
1,000002455,1,180.2
2,000002455,1,168.3
3,000002455,1,189.4
----------------------------------------------
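The closest thing to a workaround I can think of is to import into a throwaway
staging table that is all TEXT columns, then copy everything except the header
row into the real table. Something like this (the staging table name and the
test against the literal header value are my own invention, untested):
----------------------------------------------
-- hypothetical staging table: every column TEXT so the header row loads
CREATE TABLE [ld_assay_staging] (
[id] TEXT,
[barcode_id] TEXT,
[light] TEXT,
[bpm] TEXT);
.separator ,
.import test.csv ld_assay_staging
-- copy the data rows; SQLite's column affinity converts the text values
-- ('1' -> 1 for the INTEGER PRIMARY KEY, '180.2' -> 180.2 for the REAL column)
INSERT INTO ld_assay (id, barcode_id, light, bpm)
SELECT id, barcode_id, light, bpm
FROM ld_assay_staging
WHERE id <> 'id';
DROP TABLE ld_assay_staging;
----------------------------------------------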
For comparison, this works (an example from a different program):
----------------------------------------------
When R loads a ".csv" file, the [header = TRUE] argument tells read.table to
treat the first row of the file as column names rather than data:
data.frame_ld_assay <- read.table("test.csv", header = TRUE, sep = ",",
colClasses = c("integer", "character", "integer", "numeric"))
read.table also has a separate "skip" argument ("the number of lines of the
data file to skip before beginning to read data") that could be used to the
same effect.
----------------------------------------------
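If .import has an analogue, I haven't found it in my shell, but I've read that
newer versions of the sqlite3 command-line shell (3.32.0 and later, I believe)
accept --csv and --skip options on .import, which would mirror R's behaviour
in a single command (untested on my end, so treat the version and syntax as an
assumption):
----------------------------------------------
.import --csv --skip 1 test.csv ld_assay
----------------------------------------------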
Joe
Oregon State University