Re: [GENERAL] postgres import

2006-11-02 Thread Albe Laurenz
 psql -h host -p port -d database -U user < dump.sql

 It's a good enough solution in most cases, but when the row count
 starts to skyrocket, it simply doesn't seem to cut it (at least I
 couldn't make it work).

 INSERT statements? You dumped with the -d flag, didn't you? 

 No I didn't, actually. :) The data was never in the database in the
 first place: it was generated from a different source. True, it was
 generated as a CSV file which I converted into INSERT statements, but
 conversion between the two is not a problem (given 1.5GB of RAM).

Then the best way is to convert it back to CSV and use the COPY
statement to load it into the table (or \copy from psql).
You don't need any third party tools for that, it's all in PostgreSQL.
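A minimal sketch of that (table name, columns, and file path are invented here for illustration; the WITH CSV option exists from PostgreSQL 8.0 on):

```sql
-- A hypothetical table matching the CSV columns
CREATE TABLE items (name text, qty integer);

-- Server-side COPY: the file must be readable by the server process
COPY items FROM '/tmp/data.csv' WITH CSV;

-- Client-side equivalent from psql: reads the file on the client machine
\copy items FROM '/tmp/data.csv' WITH CSV
```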

Yours,
Laurenz Albe

---(end of broadcast)---
TIP 2: Don't 'kill -9' the postmaster


Re: [GENERAL] postgres import

2006-11-02 Thread Tomi NA

2006/11/2, Albe Laurenz [EMAIL PROTECTED]:

 psql -h host -p port -d database -U user < dump.sql

 It's a good enough solution in most cases, but when the row count
 starts to skyrocket, it simply doesn't seem to cut it (at least I
 couldn't make it work).

 INSERT statements? You dumped with the -d flag, didn't you?

 No I didn't, actually. :) The data was never in the database in the
 first place: it was generated from a different source. True, it was
 generated as a CSV file which I converted into INSERT statements, but
 conversion between the two is not a problem (given 1.5GB of RAM).

Then the best way is to convert it back to CSV and use the COPY
statement to load it into the table (or \copy from psql).
You don't need any third party tools for that, it's all in PostgreSQL.


I had a problem with COPY once, but I can't remember what exactly...come
to think of it, it probably could have done the job, assuming I defined
the primary key as DEFAULT nextval('id'), since I had no id in the rows
I was importing...nice to have alternatives. Thanks for the suggestion.
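Something like this sketch (table, columns, and file name invented) would let COPY fill in the id from its sequence, by listing only the columns that are actually present in the file:

```sql
-- Hypothetical table: id is generated, not present in the CSV
CREATE TABLE items (
    id   serial PRIMARY KEY,  -- i.e. DEFAULT nextval on its own sequence
    name text,
    qty  integer
);

-- Name only the CSV's columns; id falls back to its default
\copy items (name, qty) FROM '/tmp/data.csv' WITH CSV
```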

t.n.a.

---(end of broadcast)---
TIP 1: if posting/reading through Usenet, please send an appropriate
  subscribe-nomail command to [EMAIL PROTECTED] so that your
  message can get through to the mailing list cleanly


Re: [GENERAL] postgres import

2006-11-01 Thread Alban Hertroys

Tomi NA wrote:

2006/10/31, Albe Laurenz [EMAIL PROTECTED]:

You feed it to the command line interface psql.

Example:
psql -h host -p port -d database -U user < dump.sql


It's a good enough solution in most cases, but when the row count
starts to skyrocket, it simply doesn't seem to cut it (at least I
couldn't make it work). To load 1.5M rows (~230MB of INSERT statements),


INSERT statements? You dumped with the -d flag, didn't you? Otherwise
you'd have seen COPY statements instead, which are much faster (and
usually far fewer of them are necessary).


--
Alban Hertroys
[EMAIL PROTECTED]

magproductions b.v.

T: ++31(0)534346874
F: ++31(0)534346876
M:
I: www.magproductions.nl
A: Postbus 416
   7500 AK Enschede

// Integrate Your World //

---(end of broadcast)---
TIP 5: don't forget to increase your free space map settings


Re: [GENERAL] postgres import

2006-11-01 Thread Tomi NA

2006/11/1, Alban Hertroys [EMAIL PROTECTED]:

Tomi NA wrote:
 2006/10/31, Albe Laurenz [EMAIL PROTECTED]:
 You feed it to the command line interface psql.

 Example:
 psql -h host -p port -d database -U user < dump.sql

 It's a good enough solution in most cases, but when the row count
 starts to skyrocket, it simply doesn't seem to cut it (at least I
 couldn't make it work). To load 1.5M rows (~230MB of INSERT statements),

INSERT statements? You dumped with the -d flag, didn't you? Otherwise
you'd have seen COPY statements instead, which are much faster (and
usually far fewer of them are necessary).


No I didn't, actually. :) The data was never in the database in the
first place: it was generated from a different source. True, it was
generated as a CSV file which I converted into INSERT statements, but
conversion between the two is not a problem (given 1.5GB of RAM).

t.n.a.

---(end of broadcast)---
TIP 6: explain analyze is your friend


[GENERAL] postgres import

2006-10-31 Thread Antonios Katsikadamos
Hi all, I have a little problem. I have an .sql file (a db dump) and I
want to import it into Postgres on Linux. Does anyone know how I can do
it? Thanks a lot, mates.



Re: [GENERAL] postgres import

2006-10-31 Thread Albe Laurenz
 I have a little problem. I have an .sql file (a db dump)
 and I want to import it into Postgres on Linux.

 Does anyone know how I can do it?

You feed it to the command line interface psql.

Example:
psql -h host -p port -d database -U user < dump.sql

Yours,
Laurenz Albe

---(end of broadcast)---
TIP 3: Have you checked our extensive FAQ?

   http://www.postgresql.org/docs/faq


Re: [GENERAL] postgres import

2006-10-31 Thread Tomi NA

2006/10/31, Albe Laurenz [EMAIL PROTECTED]:

 I have a little problem. I have an .sql file (a db dump)
 and I want to import it into Postgres on Linux.

 Does anyone know how I can do it?

You feed it to the command line interface psql.

Example:
psql -h host -p port -d database -U user < dump.sql


It's a good enough solution in most cases, but when the row count
starts to skyrocket, it simply doesn't seem to cut it (at least I
couldn't make it work). To load 1.5M rows (~230MB of INSERT statements),
I used gvim (wonderful tool!) to transform the INSERT statements into
a CSV file and then used an ETL tool (kettle - another wonderful tool)
to import the data into the database.
This could probably have been done much more easily: I'd welcome a
helpful hint so I know for next time. :)

t.n.a.



Text manipulation tools (was Re: [GENERAL] postgres import)

2006-10-31 Thread Ron Johnson

On 10/31/06 17:17, Tomi NA wrote:
 2006/10/31, Albe Laurenz [EMAIL PROTECTED]:
[snip]
 It's a good enough solution in most cases, but when the row count
 starts to skyrocket, it simply doesn't seem to cut it (at least I
 couldn't make it work). To load 1.5M rows (~230MB of INSERT statements),
 I used gvim (wonderful tool!) to transform the INSERT statements into
 a CSV file and then used an ETL tool (kettle - another wonderful tool)
 to import the data into the database.
 This could probably have been done much more easily: I'd welcome a
 helpful hint so I know for next time. :)

Unix is chock full of streaming text manipulation tools.  In this
case, awk, Perl or Python would work well.
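A rough Python sketch of that kind of streaming conversion (assuming simple one-line, single-row INSERT statements with standard '' quote escaping; the table and regex are invented here, and this is not a robust SQL parser):

```python
import csv
import io
import re
import sys

# Match the value list of a simple single-row INSERT statement.
VALUES_RE = re.compile(r"VALUES\s*\((.*)\)\s*;\s*$", re.IGNORECASE)

def insert_to_row(line):
    """Return the values of one INSERT statement as a list of strings."""
    m = VALUES_RE.search(line)
    if m is None:
        return None  # not an INSERT line; skip it
    # SQL quotes strings with single quotes and doubles them to escape,
    # which maps directly onto csv's quotechar/doublequote handling.
    reader = csv.reader(io.StringIO(m.group(1)),
                        quotechar="'", doublequote=True,
                        skipinitialspace=True)
    return next(reader)

def convert(src, dst):
    """Stream INSERT statements from src and write CSV rows to dst."""
    writer = csv.writer(dst)
    for line in src:
        row = insert_to_row(line)
        if row is not None:
            writer.writerow(row)

if __name__ == "__main__":
    convert(sys.stdin, sys.stdout)  # python convert.py < dump.sql > data.csv
```

Since it streams line by line, memory stays flat no matter how big the dump is.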

--
Ron Johnson, Jr.
Jefferson LA  USA

Is common sense really valid?
For example, it is common sense to white-power racists that
whites are superior to blacks, and that those with brown skins
are mud people.
However, that common sense is obviously wrong.



Re: [GENERAL] postgres import

2006-10-31 Thread Tom Lane
Tomi NA [EMAIL PROTECTED] writes:
 2006/10/31, Albe Laurenz [EMAIL PROTECTED]:
 psql -h host -p port -d database -U user dump.sql

 It's a good enough solution in most cases, but when the row count
 starts to skyrocket, it simply doesn't seem to cut it (at least I
 couldn't make it work).

It certainly should work.  We've seen some platforms where libreadline
seems to be unable to tell the difference between input from a terminal
and input from a file, and performs a boatload of processing that would
be useful for interactive input but is just overhead here.  If that's
your problem, try this form instead:

psql -h host -p port -d database -U user -f dump.sql

regards, tom lane
