At 3:56 PM +0000 11/21/05, Tom Brown wrote:
Is it possible to do a mysql dump to more than one file? We will shortly need to dump a db that will be in excess of 50GB, so we will encounter file size issues.

This is on MySQL 4.1.x and RHEL 4.


Probably the best approach - knowing nothing about your db - would be to dump tables to separate files; you could write a pretty simple script to do that.
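
For example, a rough sketch of such a script (the user, password, and database names here are placeholders; adjust them to your setup):

        # dump each table to its own gzipped file
        for t in $(mysql -uuser -ppassword -N -e 'SHOW TABLES' database); do
                mysqldump -uuser -ppassword database "$t" | gzip > "$t.sql.gz"
        done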

Since mysqldump writes to stdout, you could pipe its output through gzip, bzip2, or zip, although that's unlikely to compress 50GB down to something most unixes can handle (a safe size is 2GB):

        mysqldump -uuser -p database | gzip > dump.gz
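
To load a compressed dump back in, the reverse pipe should work, something like:

        gunzip -c dump.gz | mysql -uuser -ppassword database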

You could also pipe to split (see 'man split'), which breaks the output into pieces by number of lines or number of bytes (e.g., dump.001, dump.002, ...) that you can later reassemble with cat. It would be nice to do something like

        mysql -uuser -ppassword database < `cat dump.*`

but that exact syntax won't work - the backticks would substitute the dump's contents onto the command line rather than feed them to stdin. You can, however, stream the pieces back without rebuilding one big file by piping cat into mysql; see the sketch below. Even so, going table-by-table and piping each table's dump through gzip or bzip2 is probably still the cleanest approach.
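
Something along these lines should work (the piece size and file names are just examples):

        # dump, compress, and split into 1GB pieces: dump.gz.aa, dump.gz.ab, ...
        mysqldump -uuser -ppassword database | gzip | split -b 1024m - dump.gz.

        # restore by streaming the pieces straight back into mysql,
        # with no giant intermediate file needed
        cat dump.gz.* | gunzip | mysql -uuser -ppassword database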

        steve

--
+--------------- my people are the people of the dessert, ---------------+
| Steve Edberg                                http://pgfsun.ucdavis.edu/ |
| UC Davis Genome Center                            [EMAIL PROTECTED] |
| Bioinformatics programming/database/sysadmin             (530)754-9127 |
+---------------- said t e lawrence, picking up his fork ----------------+
