I'm working on some software that will generate all sorts of statistics
on large volumes of data. I'm looking for data to experiment on. I'd
like 100 million to 1 billion records. These records can be anything
from log files to database records, or whatever. I just need tons of data.
If you
Hello,
Does anyone know how to enable concurrent_insert with MySQL version
4.0.x? I can't seem to turn it on no matter what I do.
Mark
-
Before posting, please check:
http://www.mysql.com/manual.php (the manual)
If the table has a unique key, use the REPLACE keyword instead of INSERT.
If a row with the same key already exists, it is replaced with the new
values; otherwise the new row is inserted.
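As a small runnable sketch of that behavior: SQLite also accepts the REPLACE keyword, so the snippet below uses Python's built-in sqlite3 module purely for illustration (no MySQL server assumed); the table and column names are invented for the example.

```python
import sqlite3

# Demonstrate REPLACE semantics on a table with a unique key.
# SQLite stands in for MySQL here; the behavior shown is the same.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

conn.execute("REPLACE INTO users (id, name) VALUES (1, 'alice')")  # no such key yet: row is inserted
conn.execute("REPLACE INTO users (id, name) VALUES (1, 'bob')")    # key 1 exists: row is replaced

rows = conn.execute("SELECT id, name FROM users").fetchall()
print(rows)  # [(1, 'bob')]
```

Note that MySQL's REPLACE actually deletes the old row and inserts the new one, rather than updating it in place, which matters if other columns or triggers are involved.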
--Mark
Michael Kaiser wrote:
The following loads data from a text file into a particular MySQL
table:
LOAD DATA LOCAL
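The quoted statement is cut off above, but the idea of LOAD DATA LOCAL is to read delimited lines from a text file and insert each one as a row. A minimal runnable sketch of the same idea, using Python's sqlite3 since no MySQL server is assumed (file contents and table name are invented for the example):

```python
import csv
import io
import sqlite3

# Tab-delimited input, as LOAD DATA expects by default.
# io.StringIO stands in for an actual text file on disk.
data = io.StringIO("1\talice\n2\tbob\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")

# Insert one row per line of the file.
conn.executemany(
    "INSERT INTO users (id, name) VALUES (?, ?)",
    csv.reader(data, delimiter="\t"),
)
print(conn.execute("SELECT COUNT(*) FROM users").fetchone()[0])  # 2
```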
I do this from the command line sometimes:
mysql -h HOST -pPASSWORD DATABASE < statements.sql
The statements.sql file should contain full SQL statements. This will
read in the file and execute the SQL one statement at a time.
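To illustrate what feeding a statements file to the client amounts to, here is a hedged sketch using Python's sqlite3 in place of the mysql client (the statements themselves are invented for the example):

```python
import sqlite3

# A stand-in for statements.sql: full SQL statements, executed in order,
# just as the mysql client does when the file is redirected into it.
statements = """
CREATE TABLE log (msg TEXT);
INSERT INTO log VALUES ('first');
INSERT INTO log VALUES ('second');
"""

conn = sqlite3.connect(":memory:")
conn.executescript(statements)  # runs each statement in sequence
print(conn.execute("SELECT COUNT(*) FROM log").fetchone()[0])  # 2
```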
The opposite would be to use mysqldump like so:
mysqldump -h HOST
I just wanted to say that MySQL absolutely rocks!
We are using MySQL with a very large database and it's working
beautifully. Needless to say, we will be buying licenses to further
support MySQL.
Here are some rough stats on what we were doing:
Inserts: 13,000,000 per day
Updates: 12,000,000