Richard wrote:

Hi Don,

This is what the text data contains, for importing:

-180,90,NaN    (a carriage return at the end of each record, and nothing else)
-179.917,90,NaN
-179.833,90,NaN
. . .

Richard,

Here is a copy of Puneet Kishor's reply to you from yesterday.

On Oct 3, 2005, at 11:44 AM, Richard wrote:

Did this:

sqlite3 test2.db
create table T (A, B, C);
.separator ,
.import 'sqtest2.txt' T

It looks like it is working, but the file size is still
4K and not 170 Megs.

Please note:
I exported this database as a tab-delimited file,
then as a comma-delimited file...

** Got it to import; however, it only imports the first record,
i.e.:


It is likely that your line endings are Mac (you are on a Mac, IIRC). Open the tab file in a text editor such as BBEdit (or TextWrangler -- free from Bare Bones) and change the line endings to Unix. Then try again.



sqlite> select * from T;
-180,90,NaN
sqlite>

There are a few hundred thousand more records...
Is there supposed to be some kind of loop statement, like repeat again?

--
Puneet Kishor

As he suggested, you have Mac line endings (a single CR), which need to be changed to Unix line endings (a single LF) or PC line endings (CR LF pairs). The shell reads everything up to the first LF as a line (in your case, that is the whole file). It then extracts fields from that line. It recognizes a CR as the end of a line while extracting fields, but not while reading lines. This could be considered a bug, since the code tries to warn you if it finds more fields on a line than it expects, but that check is foiled by the CR detection during field extraction. When it goes to get the next line, it hits the end of file and exits.
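As a rough illustration of the behaviour described above, here is a minimal Python sketch (the file name and sample records are made up): a file whose records end in a bare CR looks like a single line to anything that only treats LF as a line terminator.

# Write three sample records terminated by bare carriage returns (old Mac style).
data = "-180,90,NaN\r-179.917,90,NaN\r-179.833,90,NaN\r"
with open("sqtest_mac.txt", "w", newline="") as f:  # newline="" keeps the bare CRs
    f.write(data)

# Split on LF only, the way a line-based reader would.
with open("sqtest_mac.txt", "rb") as f:
    lines = [line for line in f.read().split(b"\n") if line]

print(len(lines))  # prints 1 -- the whole file reads as a single "line"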

For now, use your favorite editor to save the CSV file in a suitable format, and it should work.
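If a text editor is awkward for a 170 MB file, a small script can do the same conversion. A minimal Python sketch, assuming the input is the sqtest2.txt from the thread (the output name is made up):

# Convert old-Mac line endings (CR) to Unix line endings (LF).
with open("sqtest2.txt", "rb") as src:
    raw = src.read()

# Normalise CRLF pairs first so PC-style files do not gain blank lines,
# then turn any remaining bare CRs into LFs.
raw = raw.replace(b"\r\n", b"\n").replace(b"\r", b"\n")

with open("sqtest2_unix.txt", "wb") as dst:
    dst.write(raw)

Re-running the same .import against the converted file should then pick up every record.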

HTH
Dennis Cote

