Hello all,
  I'm still having just a bit of trouble with execute_array.  I want to parse my input 
file, break it apart with Text::CSV_XS (thanks for the great recommendation!), then 
build arrays corresponding to each column.  Everything works great, it really does.  

  But, I don't want to parse the entire file before I start sending it to the DB.  I'd 
like to send, say, 500 rows at a time, and then commit after every 5000.  So, I used 
the following script:

while (<TEXT_DATA>) {
  if ($csv->parse($_)) {
    my @rets = $csv->fields;
    my $colcount = 0;
    foreach my $col (@cols) {
      my $ret = $rets[$colcount];
      if ($ret eq "") { push (@{$insArrays{$colcount}}, undef); }
      else { push (@{$insArrays{$colcount}}, $ret); }
      $colcount++;
    }
    # Do we have the required number of rows?
    if ($insCount == $sendRows) {
      print "\tAdding $insCount rows...\n";
      my $temp_count = 0;
      while ($temp_count < $colttl) {
        # Bind the arrays to the columns...
        $DataInsert->bind_param_array($temp_count + 1, \@{$insArrays{$temp_count}});
        $temp_count++;
      }
      unless ($DataInsert->execute_array( { ArrayTupleStatus => \my @tuple_status } )) {
        print LOGFILE "\nErrors were encountered during data load...\n";
        print LOGFILE "errors: " . $dbh->errstr . "\n";
      }
      # Set the row count back to zero
      $insCount = 0;
      $temp_count = 0;
      # Clear the arrays out now...
      while ($temp_count < $colttl) {
        $insArrays{$temp_count} = [];  # reset to a fresh, empty array
        # Bind an alternate value to the columns (undef => SQL NULL)...
        $DataInsert->bind_param_array($temp_count+1, undef);
        $temp_count++;
      }
    }
    if ($commitCount == $commitNum) {
      print LOGFILE "  Sent $commitCount rows of data...\n";
      $dbh->commit; $commitCount = 0;
    }
    $insCount++;
    $commitCount++;
  }
}
# Do we have any rows left over?
if (($insCount > 0) && ($insCount < $sendRows)) {
  # If we're here, we need to insert the remaining rows...
  print "\tAdding $insCount rows...\n";
  my $temp_count = 0;
  while ($temp_count < $colttl) {
    # Bind the arrays to the columns...
    # Fails here, complains that column 1's array is smaller than column 4's...
    $DataInsert->bind_param_array($temp_count + 1, \@{$insArrays{$temp_count}});
    $temp_count++;
  }
  unless ($DataInsert->execute_array( { ArrayTupleStatus => \my @tuple_status } )  ) {
    print LOGFILE "\nErrors were encountered during data load...\n";
    print LOGFILE "errors: " . $dbh->errstr . "\n";
  }
  print "\tDone...\n";
  $insCount = 0;
  $temp_count = 0;
  while ($temp_count < $colttl) {
    $insArrays{$temp_count} = [];  # empty, not a one-element list containing undef
    $temp_count++;
  }
}
if ($commitCount < $commitNum) { $dbh->commit; }

  As you can see in the comments, after the bulk of the file has been passed to the 
DBI and the remainder gets passed, the DBI complains that the remaining arrays are 
too small.  And the thing is, they ARE smaller; they're the remainder.  Do I need to, 
maybe, commit what I've got so far, close my statement handle, open a new statement 
handle, insert the remainder, and then commit that part?  Boy, it seems like it should 
be easier...

Thanks,
amonotod


--

    `\|||/         amonotod@    | sun|perl|windows
      (@@)         charter.net  | sysadmin|dba
  ooO_(_)_Ooo____________________________________
  _____|_____|_____|_____|_____|_____|_____|_____|
