Kevin Old wrote:
> 
> Hello all,

Hello,

> I'm writing a script that will read pipe delimited data from a text file
> and insert various fields into a Postgres table.  Below is some code I'm
> trying to optimize:
> 
> while (<FHD>) {
>         chomp; #removes \n
>         chop; #removes trailing pipe
> 
>         @line = split(/\|/, $_, 502); # the line has 502 "fields", so
>                                       # split them into an array
>         $dbh->do("INSERT INTO cdl_16master VALUES(nextval('cdl_16_seq'),'" .
>                  join("','", $line[0], $line[4], $line[5], $line[6], $line[10],
>                       $line[11], $line[14], $line[18], $line[22], $line[25]) . "')");
>         $dbh->commit();
> 
> } #end while
> 
> Just wondering if anyone has a better way of accessing the data in the
> array or of storing the few fields I need temporarily until it gets
> inserted into the database.
> 
> There's a better way to do this, but I'm just not thinking of it... any
> suggestions are appreciated.


You could use a list slice on split's return value. That saves you from naming
ten array elements one by one, and you won't have to chomp or chop anything off
the end: the newline and trailing pipe land in the last field, which you never
select.

while (<FHD>) {

    @line = (split /\|/)[0,4,5,6,10,11,14,18,22,25];

    $dbh->do("INSERT INTO cdl_16master VALUES(nextval('cdl_16_seq'),'" .
             join("','", @line) . "')");

    $dbh->commit();
    }
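For what it's worth, here is a minimal self-contained sketch of the
slice-of-split idiom, with made-up sample data and field indices (the real
file has 502 fields; this just demonstrates the mechanics):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# A short sample record: fields are pipe-delimited, and the raw line
# ends with a trailing pipe and a newline, as in the original data.
my $record = "a|b|c|d|e|f|g|\n";

# Take fields 0, 2 and 5 straight from split's return list.  The
# newline rides along in the last field, but since we never select
# that field there is nothing to chomp or chop.
my @fields = (split /\|/, $record)[0, 2, 5];

print join(',', @fields), "\n";   # prints "a,c,f"
```

For the database step, you could also prepare the INSERT once outside the
loop with DBI placeholders (my $sth = $dbh->prepare("INSERT ... VALUES
(nextval('cdl_16_seq'), ?,?,?,?,?,?,?,?,?,?)") and then $sth->execute(@line)
per row); that avoids re-parsing the SQL on every iteration and lets the
driver handle quoting for you.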



John
-- 
use Perl;
program
fulfillment

-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]