[GENERAL] Load a csv file into a pgsql table
Greetings. *Apart from* the COPY command, are there other quick ways to load data from a CSV file into a PostgreSQL table? Thanks a lot!
---(end of broadcast)--- TIP 5: don't forget to increase your free space map settings
Re: [GENERAL] Load a csv file into a pgsql table
Define 'quick'. You could write a script that transforms a .csv file into INSERT statements and saves them to an .sql file. Or I suppose you could do silly ODBC stuff with MS Access.
-- Brandon Aiken, CS/IT Systems Engineer
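Brandon's script idea can be sketched in a few lines of Python (a hypothetical standalone helper, not something posted in this thread); it reads a CSV file and emits one INSERT statement per row, doubling embedded single quotes as standard SQL requires:

```python
import csv
import sys

def csv_to_inserts(tablename, rows):
    """Turn an iterable of CSV rows into a list of INSERT statements."""
    statements = []
    for row in rows:
        # Double single quotes so each value is safe inside a SQL string literal.
        values = ", ".join("'" + field.replace("'", "''") + "'" for field in row)
        statements.append(f"INSERT INTO {tablename} VALUES ({values});")
    return statements

if __name__ == "__main__" and len(sys.argv) == 3:
    table, path = sys.argv[1], sys.argv[2]
    with open(path, newline="") as f:
        for stmt in csv_to_inserts(table, csv.reader(f)):
            print(stmt)
```

Typical use would be `python csv2sql.py mytable data.csv > data.sql` followed by `psql -f data.sql` (script name assumed for illustration).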
Re: [GENERAL] Load a csv file into a pgsql table
On Sep 19, 2006, at 11:15 AM, Emi Lu wrote: *Except* copy command, are there other quick ways to load data from a csv file into a pgsql table please?

There are several bulk loaders, but I believe they all use COPY behind the scenes. If COPY isn't an option then your best bet will be many INSERTs in a transaction, but that will be significantly slower. You could cobble together a loader using Perl, DBI, and one of the CPAN CSV modules fairly easily. (But I can't think of any reason why you wouldn't use COPY, so you must have some constraint you haven't mentioned - can you expand on why COPY isn't an option?) Cheers, Steve
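The "many inserts in one transaction" approach Steve describes looks the same through any DB-API driver. Here is a minimal Python sketch, using the stdlib sqlite3 module purely as a stand-in for a PostgreSQL driver such as psycopg2 (the pattern, not the driver, is the point; with psycopg2 the connect call and `%s` placeholders are the only changes):

```python
import csv
import io
import sqlite3  # stand-in for a PostgreSQL DB-API driver such as psycopg2

def load_csv(conn, tablename, csvfile, ncols):
    """Insert every CSV row inside a single transaction."""
    placeholders = ", ".join(["?"] * ncols)  # psycopg2 would use %s
    rows = list(csv.reader(csvfile))
    conn.cursor().executemany(
        f"INSERT INTO {tablename} VALUES ({placeholders})", rows
    )
    conn.commit()  # one commit: either every row lands or none do
    return len(rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, city TEXT)")
n = load_csv(conn, "people", io.StringIO("alice,Paris\nbob,Oslo\n"), 2)
```

Batching all rows into one transaction avoids a per-row commit, which is where most of the slowdown of naive INSERT loops comes from.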
Re: [GENERAL] Load a csv file into a pgsql table
On Tue, 2006-09-19 at 13:27, Brandon Aiken wrote: You could write a script that would transform a .csv file into an INSERT statement and save it to an .sql file.

Haven't seen the OP go by, but here's one of the simplest CSV loaders ever created. No guarantees of suitability, implied or otherwise.

#!/usr/bin/php -q
<?php
$tablename = $argv[1];
$filename  = $argv[2];
if ($argc != 3) {
    echo "Usage:\n\n  loadpg tablename filename\n";
    exit;
}
if (!file_exists($filename)) {
    die("given filename doesn't exist\n");
}
print "copy $tablename from stdin;\n";
$fp = fopen($filename, "r");
while (!feof($fp)) {
    $line = fgetcsv($fp, 4096);
    // skip blank lines and the false returned at EOF
    if ($line === false || $line[0] === null) continue;
    print implode("\t", $line);
    print "\n";
}
print '\.';
print "\n";
?>

Note that you just redirect the output to psql and off you go.
Re: [GENERAL] Load a csv file into a pgsql table
Thank you for all the inputs. Actually, I am reluctant to do the inserts line by line. I plan to use a shell script to:
. replace characters such as ' with \'
. convert each line into an INSERT statement
. call psql -c to load the file into the db
Java will call this shell script; after the data is populated into the table, it will do further data comparisons based on that table.
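One caution on the quoting step in the plan above: standard SQL escapes a single quote by doubling it (''), and \' only works inside PostgreSQL's non-standard E'...' strings. A tiny hypothetical helper showing the portable form:

```python
def sql_quote(value):
    """Return value as a SQL string literal, doubling embedded single quotes."""
    return "'" + value.replace("'", "''") + "'"

# Each escaped field can then be dropped straight into an INSERT statement.
```

Doubling quotes is safe regardless of the server's standard_conforming_strings setting, whereas backslash escapes are not.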
Re: [GENERAL] Load a csv file into a pgsql table
Emi Lu wrote: *Except* copy command, are there other quick ways to load data from a csv file into a pgsql table please?

Quick? No.

Joshua D. Drake
-- === The PostgreSQL Company: Command Prompt, Inc. === Sales/Support: +1.503.667.4564 || 24x7/Emergency: +1.800.492.2240 Providing the most comprehensive PostgreSQL solutions since 1997 http://www.commandprompt.com/
Re: [GENERAL] Load a csv file into a pgsql table
On 09/19/06 13:26, Steve Atkins wrote: There are several bulk loaders, but I believe they all use COPY behind the scenes. (But I can't think of any reason why you wouldn't use copy, so you must have some constraint you haven't mentioned - can you expand on why copy isn't an option?)

COPY has great facilities for specifying the physical layout of the CSV file, but is otherwise limited. Facilities that I miss are:

statistics: periodic progress output (loaded 1 rows, loaded 2 rows, ... etc.).

skip: if the COPY dies (or is killed) after 10M rows have been loaded, it's so useful to be able to add --skip=10425000 to the command and have the bulk loader quickly scan to that record. Yes, tail(1) can slice off the unloaded records, but that means that now you have 2 files. Messy.

transactions: goes hand-in-glove with statistics and skip.

exceptions file: if you have a unique index on the table, and one of the input records is a duplicate, kick it out to an exceptions file, note it to stderr, and keep on loading.

A fields option would also be handy, for when the number of fields in the input file does not equal those in the table.

Just MHO, of course.
-- Ron Johnson, Jr. Jefferson LA USA
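COPY itself has no such --skip flag (it is an item on Ron's wish list), but the behavior is easy to emulate in a small feeder script. A minimal Python sketch that drops the first N lines before handing the remainder to COPY on stdin:

```python
from itertools import islice

def skip_rows(lines, skip):
    """Yield lines after the first `skip`, e.g. rows already loaded before a crash."""
    return islice(lines, skip, None)

# In practice the surviving lines would be piped to
# `psql -c "COPY t FROM STDIN CSV"` instead of collected into a list.
remaining = list(skip_rows(["r1\n", "r2\n", "r3\n", "r4\n"], 2))
```

Unlike the tail(1) approach Ron mentions, this streams the input, so no second intermediate file is created.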