Re: [sqlite] Importing a big text file (CSV?)

2007-07-20 Thread Alberto Simões

On 7/19/07, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:

"=?ISO-8859-1?Q?Alberto_Sim=F5es?=" <[EMAIL PROTECTED]> wrote:
> Ok, for future reference (drh, please, it would be nice to add this to
> the web site)

That is why we have a wiki (http://www.sqlite.org/cvstrac/wiki), so
that you can add things like this yourself.


We have a wiki?
Nice :)

--
Alberto Simões




Re: [sqlite] Importing a big text file (CSV?)

2007-07-19 Thread drh
"=?ISO-8859-1?Q?Alberto_Sim=F5es?=" <[EMAIL PROTECTED]> wrote:
> Ok, for future reference (drh, please, it would be nice to add this to
> the web site)

That is why we have a wiki (http://www.sqlite.org/cvstrac/wiki), so
that you can add things like this yourself.

I'm busy trying to fix database corruption bugs (like ticket #2518).

;-)


> 
> To import:
>   3  5  6
>   3  4  6
> 
> CREATE TABLE foo (v1,v2,v3);
> .separator " "
> .import "file.dat" foo
> 
> 
--
D. Richard Hipp <[EMAIL PROTECTED]>





Re: [sqlite] Importing a big text file (CSV?)

2007-07-19 Thread P Kishor

On 7/19/07, Veikko Mäkinen <[EMAIL PROTECTED]> wrote:

Alberto Simões wrote:
> Hi
>
> I have a big file (16 000 000 lines) with records like
>
> 2 3 4
> 4 3 2
> 5 4 387
> 5 8 5473
> ...
>
> and I want to import this into an SQLite table.
> Although I can convert all this to INSERT commands very easily, I
> would like to ask first if there is any faster method.

I insert 150 000 records with a prepared statement in a single
transaction and it takes about 12 secs. I think that's fast enough :)
Mind you, my table doesn't have indexes. Indexes make inserting notably
slower.



337 seconds on my MacBook Pro

lucknow:~/Data/temp punkish$ sqlite3 test.db
SQLite version 3.3.8
Enter ".help" for instructions
sqlite> create table test (a, b, c);
sqlite> .q
lucknow:~/Data/temp punkish$ vim test.pl
#!/usr/local/bin/perl -w
use strict; use DBI; use Benchmark;

my $t0 = new Benchmark;
# AutoCommit => 0 keeps the whole run inside a single transaction
my $dbh = DBI->connect(
 "dbi:SQLite:dbname=test.db", "", "", { RaiseError => 1, AutoCommit => 0 }
);

# prepare once, then execute 16 million times with bound values
my $sth = $dbh->prepare(qq{INSERT INTO test (a, b, c) VALUES (?, ?, ?)});
for (1 .. 16_000_000) { $sth->execute($_, $_, $_); }
$dbh->commit;

my $t1 = new Benchmark;
print "This took " . timestr( timediff($t1, $t0) ) . "\n";

lucknow:~/Data/temp punkish$ ls
test.db test.pl
lucknow:~/Data/temp punkish$ perl test.pl
This took 337 wallclock secs (279.02 usr + 15.67 sys = 294.69 CPU)
lucknow:~/Data/temp punkish$ ls -l
-rw-r--r--   1 punkish  punkish  542086144 Jul 19 11:29 test.db
-rw-r--r--   1 punkish  punkish        432 Jul 19 11:23 test.pl
lucknow:~/Data/temp punkish$ sqlite3 test.db
SQLite version 3.3.8
Enter ".help" for instructions
sqlite> select count(*) from test;
16000000
sqlite>

--
Puneet Kishor http://punkish.eidesis.org/
Nelson Inst. for Env. Studies, UW-Madison http://www.nelson.wisc.edu/
Open Source Geospatial Foundation http://www.osgeo.org/education/
S&T Policy Fellow, National Academy of Sciences http://www.nas.edu/
-
collaborate, communicate, compete
=




Re: [sqlite] Importing a big text file (CSV?)

2007-07-19 Thread Alberto Simões

Ok, for future reference (drh, please, it would be nice to add this to
the web site)

To import:
 3  5  6
 3  4  6

CREATE TABLE foo (v1,v2,v3);
.separator " "
.import "file.dat" foo


Cheers
Alberto

On 7/19/07, Yusuke ITO <[EMAIL PROTECTED]> wrote:

Hi,

COPY command (like PostgreSQL)
http://www.sqlite.org/lang_copy.html

COPY tbl_foo (col1, col2, col3) FROM stdin;
2   3   4
4   3   2
5   4   387
5   8   5473
\.


--
Yusuke ITO
[EMAIL PROTECTED]

On Thu, 19 Jul 2007 13:01:53 +0200
"Sylko Zschiedrich" <[EMAIL PROTECTED]> wrote:
> We use precompiled insert statements and bind the parameters.
> The inserts are done in a transaction that is committed and reopened every
> 1000 iterations.
>
> Ciao
> Sylko
>
> -Original Message-
> From: Alberto Simões [mailto:[EMAIL PROTECTED]
> Sent: Thursday, 19 July 2007 11:57
> To: sqlite-users@sqlite.org
> Subject: [sqlite] Importing a big text file (CSV?)
>
> Hi
>
> I have a big file (16 000 000 lines) with records like
>
> 2 3 4
> 4 3 2
> 5 4 387
> 5 8 5473
> ...
>
> and I want to import this into an SQLite table.
> Although I can convert all this to INSERT commands very easily, I
> would like to ask first if there is any faster method.
>
> Cheers
> Alberto
> --
> Alberto Simões









--
Alberto Simões


RE: [sqlite] Importing a big text file (CSV?)

2007-07-19 Thread Griggs, Donald
Regarding: "Meanwhile I found an '.import' command in SQLite, but I
can't find suitable documentation on how it works."


It can be easy to miss the page http://www.sqlite.org/sqlite.html,
where this is documented.

Basically, it sounds like you might want to invoke the command-line
utility, sqlite3, then:
.separator ' '
.import myfile.txt  mytable

(Note that there must be no terminating semicolon on these dot commands)
This assumes you have precisely ONE space between each and every value.

With 16 million lines, you may have some subtler issues.
You'll want to remove any indices and recreate them after the import,
for example.
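
A minimal sketch of that workflow in the sqlite3 shell (the index name,
table name, file name, and column here are illustrative, not from the
original post):

sqlite> DROP INDEX myindex;
sqlite> .separator ' '
sqlite> .import myfile.txt mytable
sqlite> CREATE INDEX myindex ON mytable(v1);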





Re: [sqlite] Importing a big text file (CSV?)

2007-07-19 Thread Veikko Mäkinen

Alberto Simões wrote:

Hi

I have a big file (16 000 000 lines) with records like

2 3 4
4 3 2
5 4 387
5 8 5473
...

and I want to import this into an SQLite table.
Although I can convert all this to INSERT commands very easily, I
would like to ask first if there is any faster method.


I insert 150 000 records with a prepared statement in a single
transaction and it takes about 12 secs. I think that's fast enough :)
Mind you, my table doesn't have indexes. Indexes make inserting notably
slower.



-veikko





Re: [sqlite] Importing a big text file (CSV?)

2007-07-19 Thread Yusuke ITO
Hi, 

COPY command (like PostgreSQL)
http://www.sqlite.org/lang_copy.html

COPY tbl_foo (col1, col2, col3) FROM stdin;
2   3   4
4   3   2
5   4   387
5   8   5473
\.


--
Yusuke ITO
[EMAIL PROTECTED]

On Thu, 19 Jul 2007 13:01:53 +0200
"Sylko Zschiedrich" <[EMAIL PROTECTED]> wrote:
> We use precompiled insert statements and bind the parameters.
> The inserts are done in a transaction that is committed and reopened every
> 1000 iterations.
> 
> Ciao
> Sylko
> 
> -Original Message-
> From: Alberto Simões [mailto:[EMAIL PROTECTED]
> Sent: Thursday, 19 July 2007 11:57
> To: sqlite-users@sqlite.org
> Subject: [sqlite] Importing a big text file (CSV?)
> 
> Hi
> 
> I have a big file (16 000 000 lines) with records like
> 
> 2 3 4
> 4 3 2
> 5 4 387
> 5 8 5473
> ...
> 
> and I want to import this into an SQLite table.
> Although I can convert all this to INSERT commands very easily, I
> would like to ask first if there is any faster method.
> 
> Cheers
> Alberto
> -- 
> Alberto Simões
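
For reference, a minimal Perl sketch of the batching pattern Sylko
describes above (one prepared INSERT with bound parameters, committing
every 1000 rows); the file name file.dat and the table foo (v1,v2,v3)
are borrowed from elsewhere in the thread, not from Sylko's actual code:

#!/usr/bin/perl
use strict; use warnings; use DBI;

my $dbh = DBI->connect("dbi:SQLite:dbname=test.db", "", "",
    { RaiseError => 1, AutoCommit => 0 });
$dbh->do("CREATE TABLE IF NOT EXISTS foo (v1,v2,v3)");

# prepare once, bind values on every execute
my $sth = $dbh->prepare("INSERT INTO foo (v1, v2, v3) VALUES (?, ?, ?)");

open my $fh, '<', 'file.dat' or die "file.dat: $!";
my $n = 0;
while (<$fh>) {
    chomp;
    $sth->execute(split ' ');          # whitespace-separated fields
    $dbh->commit if ++$n % 1000 == 0;  # commit, starting a new transaction
}
$dbh->commit;                          # flush the final partial batch
$dbh->disconnect;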







Re: [sqlite] Importing a big text file (CSV?)

2007-07-19 Thread P Kishor

On 7/19/07, Alberto Simões <[EMAIL PROTECTED]> wrote:

Hi

I have a big file (16 000 000 lines) with records like

2 3 4
4 3 2
5 4 387
5 8 5473
...

and I want to import this into an SQLite table.
Although I can convert all this to INSERT commands very easily, I
would like to ask first if there is any faster method.




Try the .import command. That does the job very fast.
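
For instance (assuming a space-separated file and the foo table defined
elsewhere in the thread):

sqlite> .separator ' '
sqlite> .import file.dat foo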

--
Puneet Kishor http://punkish.eidesis.org/
Nelson Inst. for Env. Studies, UW-Madison http://www.nelson.wisc.edu/
Open Source Geospatial Foundation http://www.osgeo.org/education/
S&T Policy Fellow, National Academy of Sciences http://www.nas.edu/
-
collaborate, communicate, compete
=




Re: [sqlite] Importing a big text file (CSV?)

2007-07-19 Thread Alberto Simões

I have a big file (16 000 000 lines) with records like

2 3 4
4 3 2
5 4 387
5 8 5473
...

and I want to import this into an SQLite table.
Although I can convert all this to INSERT commands very easily, I
would like to ask first if there is any faster method.


Meanwhile I found an '.import' command in SQLite, but I can't find
suitable documentation on how it works.

--
Alberto Simões
