I may not have expressed myself well: I have to migrate tables between two 
databases that contain millions of rows, and I only need specific columns.

Now it works using placeholders only for the values, but the inserts are very 
slow even with AutoCommit disabled.

Getting data from $src_db is fast, but I'm trying to optimize the inserts by 
putting multiple value tuples in each INSERT statement, like: INSERT INTO 
$table (column1, column2) VALUES ("foo1", "bar1"), ("foo2", "bar2"), .....
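As a sketch of that idea: the snippet below builds one INSERT whose VALUES clause holds $chunksize placeholder groups, which can then be prepared once and executed per chunk. The table name (my_table) and column names here are made-up examples, not taken from my actual schema.

```perl
use strict;
use warnings;

# Hypothetical example columns and chunk size.
my @columns   = ('column1', 'column2');
my $chunksize = 100;

# One placeholder group per row: "(?, ?)".
my $group = '(' . join(', ', ('?') x @columns) . ')';

# Repeat the group $chunksize times in the VALUES clause.
my $sql = sprintf 'INSERT INTO my_table (%s) VALUES %s',
                  join(', ', @columns),
                  join(', ', ($group) x $chunksize);

print $sql, "\n";
```

The resulting statement takes $chunksize * @columns bind values in a single execute, so only one round trip per chunk.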

The way I do it consumes a lot of memory. I'm not sure what I'm doing wrong 
and would appreciate any advice. This is what I did:


my $src_dbh = DBI->connect(
    sprintf('dbi:mysql:database=%s;host=%s;port=3306', $src_db{'db'}, $src_db{'host'}),
    $src_db{'user'},
    $src_db{'password'},
    {
        PrintError        => 1,
        RaiseError        => 1,
        mysql_enable_utf8 => 1,
        mysql_use_result  => 1,
    },
) or die "Can't connect to database: $DBI::errstr\n";

my $dst_dbh = DBI->connect(
    sprintf('dbi:mysql:database=%s;host=%s;port=3306', $dst_db{'db'}, $dst_db{'host'}),
    $dst_db{'user'},
    $dst_db{'password'},
    {
        PrintError        => 1,
        RaiseError        => 1,
        AutoCommit        => 0,
        mysql_enable_utf8 => 1,
    },
) or die "Can't connect to database: $DBI::errstr\n";


my $chunksize = 100;

[...]

  while ( my @row = $src_sth->fetchrow_array ) {
      push @insert_rows, @row;
      # Flush once we have $chunksize rows' worth of values. The original
      # modulus test compared last-index values ($#insert_rows % $#row) and,
      # with a two-column row, could never equal $chunksize - 1, so the
      # array grew without bound -- hence the memory blow-up.
      if ( @insert_rows == $chunksize * @row ) {
          $dst_sth->execute(@insert_rows);
          @insert_rows = ();
      }
  }
  # Any leftover rows still need a final execute with a statement that has
  # the matching (smaller) number of placeholder groups.
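To see the batching arithmetic in isolation, here is a self-contained sketch with no database at all: a made-up stream of 250 two-column rows, counting how often the flush condition fires and what remains to be flushed at the end.

```perl
use strict;
use warnings;

# Stand-ins for the real loop: 2 columns per row, chunks of 100 rows.
my $chunksize = 100;
my @row       = ('foo', 'bar');
my ( @insert_rows, $executes );

for my $i ( 1 .. 250 ) {    # pretend the source query returned 250 rows
    push @insert_rows, @row;
    # Compare element counts, not last indices: flush at $chunksize rows.
    if ( @insert_rows == $chunksize * @row ) {
        $executes++;        # stand-in for $dst_sth->execute(@insert_rows)
        @insert_rows = ();
    }
}

# 250 rows -> 2 full chunks executed, 50 rows (100 values) left to flush.
printf "executes=%d leftover_values=%d\n", $executes, scalar @insert_rows;
```

Assigning `@insert_rows = ()` (rather than `undef @insert_rows`) lets Perl reuse the array's allocation between chunks instead of freeing and regrowing it each time.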

[...]
The @insert_rows array seems to consume an inordinate amount of memory. Where 
am I making the mistake?


    On Saturday, April 10, 2021, 12:29:18 AM GMT+2, Lawrence Statton <lawre...@perl.mx> wrote:
 On 4/9/21 7:11 AM, mailing lists via beginners wrote:
> thanks Andrew
> 
> I need to insert millions of rows so I need to have a good performance 
> using placeholders
> 
> 

You cannot use placeholders for the table name.

If you have millions of *tables*, there is something very, very, very 
wrong with your database design.

"Premature optimization is the root of all evil" -- do you have hard 
numeric evidence that the time to create the statement-handle is the 
bottleneck in your system, or are you leaping to the conclusion that you 
"need" to parameterize the table name?

If you have only a few dozen tables, and you need the same statement for 
all of them, then something like

my %sth_for_table;
for my $table (qw/ foo bar baz waldo pepper salt pork … chains /) {
    $sth_for_table{$table} = $dbh->prepare("whatever whatever into $table");
}

then use the appropriate $sth from the hash for your execute.
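As a runnable illustration of that lookup (using plain SQL strings so it works without a live database; the table and column names are invented):

```perl
use strict;
use warnings;

# Hypothetical sketch: one SQL statement per table, built up front and
# keyed by table name -- the same shape as the %sth_for_table hash above.
my @tables        = qw( foo bar baz );
my %sql_for_table = map {
    $_ => "INSERT INTO $_ (column1, column2) VALUES (?, ?)"
} @tables;

# At run time you look the statement up by name instead of re-preparing:
#   $sth_for_table{$table}->execute(@values);
print $sql_for_table{'bar'}, "\n";
```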

That said: needing to parameterize the *table name* is an almost sure 
sign that "You're doing it wrong".


-- 
To unsubscribe, e-mail: beginners-unsubscr...@perl.org
For additional commands, e-mail: beginners-h...@perl.org
http://learn.perl.org/


  
