Hello all,

I'm writing a script that reads pipe-delimited data from a text file
and inserts various fields into a Postgres table.  Below is some code I'm
trying to optimize:


while (<FHD>) {
        chomp;                        # removes \n
        chop;                         # removes trailing pipe

        @line = split(/\|/, $_, 502); # the line has 502 "fields", so split
                                      # them into an array
        $dbh->do("INSERT INTO cdl_16master VALUES(nextval('cdl_16_seq'),'" .
                 join("','", $line[0], $line[4], $line[5], $line[6],
                      $line[10], $line[11], $line[14], $line[18],
                      $line[22], $line[25]) .
                 "')");
        $dbh->commit();
} # end while


Just wondering if anyone has a better way of accessing the data in the
array or of storing the few fields I need temporarily until it gets
inserted into the database.

There's a better way to do this, but I'm just not thinking right... any
suggestions are appreciated.
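For what it's worth, here is a sketch of one common approach: prepare the
INSERT once outside the loop with placeholders, pull the wanted fields with
an array slice, and commit once at the end.  The connection string, file
name, and filehandle below are placeholders; the table and sequence names
are taken from the code above.

```perl
use strict;
use warnings;
use DBI;

# Connection details are assumptions -- adjust for your database.
my $dbh = DBI->connect("dbi:Pg:dbname=mydb", "user", "pass",
                       { AutoCommit => 0, RaiseError => 1 });

# Prepare once; placeholders let the driver handle quoting, so data
# containing quote characters can't break the SQL.
my $sth = $dbh->prepare(
    "INSERT INTO cdl_16master VALUES (nextval('cdl_16_seq'), "
    . join(",", ("?") x 10) . ")"
);

open my $fhd, '<', 'data.txt' or die "can't open data.txt: $!";
while (my $row = <$fhd>) {
    chomp $row;
    $row =~ s/\|$//;                  # drop the trailing pipe, if present
    my @line = split /\|/, $row, 502;

    # An array slice grabs just the fields you need in one expression.
    $sth->execute(@line[0, 4, 5, 6, 10, 11, 14, 18, 22, 25]);
}
$dbh->commit;                         # one commit is much faster than
close $fhd;                           # one per row
$dbh->disconnect;
```

Committing per row, as in the original loop, forces Postgres to flush on
every insert; batching the whole file into one transaction usually speeds
this kind of load up considerably.  (If the file is huge, Postgres's COPY
command, exposed via DBD::Pg, is faster still.)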

Thanks,

Kevin
[EMAIL PROTECTED]


