Hi folks,
Sorry if this is a duplicate post; I've been trying to find a way to import data into Postgres from a CSV file. The problem is that my database has columns containing newline characters (a mix of Mac and Unix line endings). When I export these tables to CSV, the embedded line breaks end up in the data and break the COPY procedure.
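To illustrate the failure mode, here is a small sketch (using Python's csv module, with made-up data rather than my actual export) of how a quoted field containing an embedded newline spans two physical lines, so a line-oriented loader sees one short record and one stray line:

```python
import csv
import io

# A row whose middle field contains an embedded newline,
# as described above.
row = ["1", "line one\nline two", "done"]

buf = io.StringIO()
csv.writer(buf).writerow(row)
data = buf.getvalue()

# The quoted field spans two physical lines, so anything that
# reads the file line by line splits this record in half.
print(repr(data))

# A CSV-aware parser, by contrast, recovers the original row.
parsed = next(csv.reader(io.StringIO(data)))
```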
I also tried the script posted in one of the previous threads:
#!/usr/bin/perl
$inquotes = 0;
while (<>) {
    # Strip the trailing line ending (handles both CRLF and LF;
    # the original double chop ate a data character on LF-only lines)
    s/\r?\n\z//;

    # First pass: replace every comma that is not inside quotes
    # with a tilde, so it can serve as the field delimiter
    for ($i = 0; $i < length($_); $i++) {
        $char = substr($_, $i, 1);
        if ($char eq '"') {
            $inquotes = !$inquotes;
        } elsif (!$inquotes && $char eq ',') {
            substr($_, $i, 1) = "~";
        }
    }

    # Strip any remaining quotes
    s/"//g;
    print "$_\n";
}
perl scriptname.pl data_file > outputfile.dat
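Since that script processes one physical line at a time, it can't rejoin records that were split by embedded newlines. One way around that, sketched here in Python on the assumption that double quotes are balanced within each logical record, is to glue physical lines back together until the quote count is even, and only then treat the buffer as one record:

```python
def logical_records(lines):
    """Merge physical lines into logical CSV records by tracking
    whether we are still inside a quoted field (odd quote count)."""
    buf = ""
    for line in lines:
        buf += line
        # An even number of double quotes means every quoted
        # field opened in this buffer has also been closed.
        if buf.count('"') % 2 == 0:
            yield buf
            buf = ""
    if buf:
        # Trailing unbalanced data: emit it rather than drop it.
        yield buf

# Hypothetical sample input: the second record spans two lines.
lines = ['id,comment\n', '1,"first line\n', 'second line"\n', '2,plain\n']
records = list(logical_records(lines))
```

Doubled quotes used as escapes ("") add two to the count, so they leave the parity unchanged and don't confuse the check.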
When I then run the COPY command, I get errors like "missing data for column xyz".
Any possible hints?
--
Thanks,
Sumeet