You could do it by creating a virtual table that parses the file and returns rows to SQLite.

For example in madIS we have implemented the "file" virtual table:

http://doc.madis.googlecode.com/hg/vtable.html#module-functions.vtable.file

which can present local files or remote files (via HTTP) to SQLite as a table, with the columns deduced automatically. The files may even be gzipped, and the "file" operator will uncompress them in a streaming fashion (no temporary local file is used for the decompression).

The simplest example would be:

select * from file('data.csv.gz');

(The file type and compression are deduced automatically from the filename extension.)
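Outside madIS, the same idea can be sketched in plain Python with the standard sqlite3, csv, and gzip modules: deduce the columns from the CSV header row and create the table on the fly, so no fields have to be declared up front. The function and file names here are illustrative, not part of any library.

```python
import csv
import gzip
import sqlite3

# Write a small sample file so the sketch is self-contained.
with gzip.open("data.csv.gz", "wt", newline="") as f:
    f.write("name,age\nalice,30\nbob,25\n")

def import_csv(db, table, path):
    """Import a CSV file (gzip-compressed if the extension says so)
    into a new SQLite table, deducing column names from the header."""
    opener = gzip.open if path.endswith(".gz") else open
    with opener(path, "rt", newline="") as f:
        reader = csv.reader(f)
        header = next(reader)  # first row gives the column names
        cols = ", ".join('"%s"' % c for c in header)
        placeholders = ", ".join("?" * len(header))
        db.execute('CREATE TABLE "%s" (%s)' % (table, cols))
        # executemany consumes the reader row by row (streaming).
        db.executemany(
            'INSERT INTO "%s" VALUES (%s)' % (table, placeholders),
            reader)

db = sqlite3.connect(":memory:")
import_csv(db, "data", "data.csv.gz")
rows = db.execute("SELECT * FROM data").fetchall()
print(rows)
```

Unlike the madIS virtual table, this copies the data into a real table, but it answers the original question of importing without specifying the fields.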

You can even do streaming XML parsing by piping the "file" virtual table into the "xmlparse" virtual table:

select * from (xmlparse file 'data.xml');

The unusual "xmlparse file" ("operator [space] operator") notation above is needed in order to pipe virtual tables. Without it, piping one virtual table into another would be very hard to write.
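The streaming behaviour that the pipe notation gives you can be approximated in plain Python with xml.etree.ElementTree.iterparse, which yields elements as they finish parsing instead of loading the whole document. The XML shape and table layout below are made up for illustration.

```python
import io
import sqlite3
import xml.etree.ElementTree as ET

# A small in-memory stand-in for 'data.xml'.
xml_data = io.StringIO(
    "<rows>"
    "<row><name>alice</name><age>30</age></row>"
    "<row><name>bob</name><age>25</age></row>"
    "</rows>")

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE data (name, age)")

# iterparse fires an 'end' event when each element closes, so only
# one <row> subtree needs to be in memory at a time.
for event, elem in ET.iterparse(xml_data, events=("end",)):
    if elem.tag == "row":
        db.execute("INSERT INTO data VALUES (?, ?)",
                   (elem.findtext("name"), elem.findtext("age")))
        elem.clear()  # drop the parsed subtree to keep memory flat

rows = db.execute("SELECT * FROM data").fetchall()
print(rows)
```

This is the same "parse as a stream, emit rows to SQLite" idea that the file | xmlparse pipeline expresses in one SELECT.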

lefteris.

On 22/9/2012 10:05 PM, Sébastien Roux wrote:
Hi, is there a way of doing a dynamic file import into sqlite without
specifying the table fields? Just specifying the target (temporary?) table,
so that columns or fields are created dynamically.

If this cannot be done with the command-line interface, maybe by using Perl
modules (DBD, DBI)?

Many thanks for any help you could provide.

Sébastien Roux
_______________________________________________
sqlite-users mailing list
[email protected]
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users

