Python with SQLObject would make this simple; you won't even have to write SQL.
Define your data model, then split each line of the CSV into a list (array):

for line in file:
    cols = line.rstrip('\n').split(',')
    Table(col1=cols[0], col2=cols[1], col3=cols[2])

If you have a table with a large number of columns, you can write a
nested "for" loop that iterates over the lines in the file and then
over the items in each line. I do this all the time at work to
post-process reports.
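A minimal sketch of that nested-loop idea (the input lines and the
"colN" names here are made up for illustration; a plain dict stands
in for the ORM row object):

```python
# Stand-in for lines read from a CSV file.
lines = ['a,b,c', '1,2,3']

rows = []
for line in lines:                              # outer loop: lines in the file
    row = {}
    for i, item in enumerate(line.split(',')):  # inner loop: items in the line
        row['col%d' % (i + 1)] = item
    rows.append(row)

print(rows[1]['col1'])  # -> '1'
```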

On 3/9/07, Dennis Cote <[EMAIL PROTECTED]> wrote:
Jacky J wrote:
> How do you properly escape the endline character when using csv import?
> MySQL for example uses \n, but sqlite puts a linefeed directly into the
> export.  However, each line is delimited by a linefeed, so sqlite will
> get
> confused when it tries to import.  Do i have to resort to insert
> statements
> for this?
>
> Thanks
>
I don't believe you can do this. SQLite's CSV format handling is quite
minimal and I don't believe it handles fields with embedded newline
characters (which are allowed in the CSV file format).

You may want to try some third party import tools (Jay Sprenkle has one)
or you could write your own code to import the file.
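(A rough sketch of that do-it-yourself route in Python, using the
standard csv and sqlite3 modules; the table and column names are
invented, and a StringIO stands in for the real file. The csv module
parses quoted fields with embedded newlines, which is exactly the
case SQLite's built-in import trips over:)

```python
import csv
import io
import sqlite3

# In-memory database with a made-up three-column table.
con = sqlite3.connect(':memory:')
con.execute('CREATE TABLE t (col1 TEXT, col2 TEXT, col3 TEXT)')

# Stand-in for a real CSV file; note the quoted field containing
# an embedded newline.
data = io.StringIO('1,"line one\nline two",3\n4,5,6\n')

# csv.reader yields one list per record, handling the quoted
# newline correctly, unlike a plain line-by-line split.
con.executemany('INSERT INTO t VALUES (?, ?, ?)', csv.reader(data))
con.commit()

print(con.execute('SELECT col2 FROM t').fetchone()[0])
```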

HTH
Dennis Cote


-----------------------------------------------------------------------------
To unsubscribe, send email to [EMAIL PROTECTED]
-----------------------------------------------------------------------------




--
William F Pearson III

