I do not believe this is currently an option in the `load data infile`
syntax. One option would be to read the file programmatically and issue a
commit after every x inserts, along the lines of the sketch below.
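
For example, here is a rough sketch in Python using the MySQLdb driver
(any DB-API driver would work the same way). The file name, table,
column list, and connection details are placeholders for illustration,
and it assumes a tab-delimited dump, which matches the default field
terminator that LOAD DATA INFILE expects:

import MySQLdb

BATCH_SIZE = 50000  # commit after this many rows

conn = MySQLdb.connect(host="localhost", user="user",
                       passwd="secret", db="mydb")
cur = conn.cursor()

insert_sql = ("INSERT INTO mytable (col1, col2, col3) "
              "VALUES (%s, %s, %s)")  # placeholder table/columns

batch = []
for line in open("dump.txt"):
    # split on tabs, the same delimiter LOAD DATA INFILE uses by default
    batch.append(line.rstrip("\n").split("\t"))
    if len(batch) >= BATCH_SIZE:
        cur.executemany(insert_sql, batch)
        conn.commit()
        batch = []

if batch:  # flush the final partial batch
    cur.executemany(insert_sql, batch)
    conn.commit()

cur.close()
conn.close()

Committing every BATCH_SIZE rows keeps each transaction small, so you
would not need to enlarge the InnoDB log files just to fit the whole
load into a single transaction.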

-----Original Message-----
From: Michael Lee
To: [EMAIL PROTECTED]
Sent: 6/28/04 1:21 AM
Subject: INNODB transaction log size

Hi,
 
I would like to migrate my DB from Sybase ASE to MySQL InnoDB tables.
Data has been extracted and stored as a file. I want to use the command
LOAD DATA INFILE to insert the data into MySQL. However, some tables
contain millions of rows. Can I control the batch size of the load
(e.g. commit the transaction after every 50000 rows inserted)?
 
If not, should I define a very large transaction log to handle the huge
transaction? (Currently it is 5 MB.)
 
Any suggestion is welcomed.
 
TIA
Michael 


