A good approach is to define your dataset.  You are just using a CSV, so
I would do the following:  1) load each CSV record into a data structure
(hashes are great for this); 2) insert the record into the table by way
of a generic loader method.

-- code snippet --

use strict;
use Text::CSV;  #-- I use this because I am familiar with it.
use DBI;

my $dbh = DBI->connect($data_source, $username, $auth, \%attr);
my $csv = Text::CSV->new();  #-- create CSV parsing object.

#-- the file.
open(FILE, 'datafile.txt') || die $!;
while (<FILE>) {
    chomp;
    insertRecord(buildRecord($_)) || die "some error occurred";
}

sub buildRecord {
    my $line    = shift;
    my $tbl_nam = 'table';
    my @fields  = qw(field1 field2 field3);
    my $record  = {};
    $csv->parse($line);
    my @data = $csv->fields();
    for (my $i = 0; $i < @fields; $i++) {
        $record->{$tbl_nam}->{$fields[$i]} = $data[$i];
    }
    return $record;
}

sub insertRecord {
    my $record = shift;
    my ($tbl_nam) = keys %{$record};
    #-- build sql
    my $columns;
    my $placeholder;
    my @data;
    foreach my $col (keys %{$record->{$tbl_nam}}) {
        $columns     .= "$col, ";
        $placeholder .= '?, ';
        push(@data, $record->{$tbl_nam}->{$col});
    }
    $columns     =~ s/, $//;
    $placeholder =~ s/, $//;
    $dbh->do("INSERT INTO $tbl_nam ($columns) VALUES($placeholder)",
             undef, @data) || die $dbh->errstr();
}

-- end code snippet --

This code has not been compiled, so you may find some syntax problems.
I think the general logic is there though.  Obviously some things can be
done to optimize the code.  I have a Db.pm I use to handle some of the
optimization and don't want to go into detail here.  Also, you can change
your buildRecord method to be more generic by passing in the fields.
Also, references could be used a little more extensively to eliminate
pass by value.  Don't forget to commit your data if you don't have
autocommit on.
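Along the lines of that last suggestion, here is a sketch of what a more
generic buildRecord might look like, taking the table name and field list
as arguments instead of hard-coding them (the argument shapes are my own
choice; a hash slice does the field-to-value mapping in one line):

```perl
use strict;
use warnings;

#-- Hypothetical generic buildRecord: caller supplies table name, an
#-- arrayref of field names, and an arrayref of parsed CSV values.
sub buildRecord {
    my ($tbl_nam, $fields, $data) = @_;
    my $record = {};
    #-- hash slice: pair up each field name with its value
    @{ $record->{$tbl_nam} }{ @$fields } = @$data;
    return $record;
}
```

You would call it as buildRecord('table', \@fields, [$csv->fields()]);
passing arrayrefs also avoids the extra pass-by-value copies mentioned
above.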

>>>NIPP, SCOTT V (SBCSI) <[EMAIL PROTECTED]> 09/07 7:04 am >>>
I have a table with 22 columns, and my INSERT on this table is
actually putting in data to possibly all 22 fields. My question is, is
there a better way to write this INSERT? Here's what I have:

    while ($line = <DATA>) {
        $csv->parse($line);
        @data = $csv->fields;
        $sth = $dbh->prepare(qq{INSERT INTO AllMid_Data
            VALUES(?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)})
            or print "Error with INSERT _prepare_: $DBI::errstr\n";

        $sth->execute(undef, $data[0], $data[1], $data[2], $data[3],
            $data[4], $data[5], $data[6], $data[7], $data[8], $data[9],
            $data[10], $data[11], $data[12], $data[13], $data[14],
            $data[15], $data[16], $data[17], $data[18], $data[19],
            $data[20])
            or print "Error with INSERT _execute_: $DBI::errstr\n";
    }

The 'execute' statement is quite long, and almost all of the
data comes from a single array. I don't think this is a big deal in and
of itself, just looking for a better way to do this. Thanks.

    Scott Nipp
Phone: (214) 858-1289
E-mail: [EMAIL PROTECTED]
Web: http:\\ldsa.sbcld.sbc.com
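For what it's worth, the 22 placeholders and the long execute() call in
the question can both be generated rather than typed out.  A sketch (the
table name and column count are taken from the question; none of this was
run against the real AllMid_Data schema):

```perl
use strict;
use warnings;

#-- Build "INSERT INTO <table> VALUES(?,?,...)" with one '?' per column,
#-- so the statement never has to be written out by hand.
sub placeholder_sql {
    my ($table, $ncols) = @_;
    my $marks = join(',', ('?') x $ncols);
    return "INSERT INTO $table VALUES($marks)";
}

#-- 22 columns: a NULL for the first column plus 21 values from @data.
#-- Prepare once, outside the read loop, then execute per record:
#--   my $sth = $dbh->prepare(placeholder_sql('AllMid_Data', 22));
#--   $sth->execute(undef, @data[0 .. 20]);
```

Preparing the statement once outside the loop also saves the driver from
re-parsing identical SQL on every record.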


  
