Yes, that's what I'm after, as I know it will do individual tables ... but I'd like it to do one file for each and every table within each and every database, without having to maintain a batch script with multiple calls to mysqldump specifying them all.
It'd be something like:

  mysqldump -u user -psecret --all-databases --file-per-table --output-dir=/path/to/backups

Dan

On 7/7/06, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
mysqldump will dump a database or just a table, it just depends on what you
specify:

  mysqldump [OPTIONS] database [tables]

Of course, if you want to automate this and don't know the table (or
database) names in advance, you'd need to do something (e.g., a mysqlshow)
to get that list first.

Is that what you're after, or am I missing something?

- Rick

------------ Original Message ------------
> Date: Friday, July 07, 2006 02:53:11 PM -0500
> From: Dan Buettner <[EMAIL PROTECTED]>
> To: mysql@lists.mysql.com
> Subject: mysqldump - dump file per table?
>
> I'm preparing to implement some mysqldump-based backups, and would
> really like to find an easy way to dump out one SQL file per table,
> rather than a single massive SQL file with all tables from all
> databases.
>
> In other words, if I have database DB1 with tables TBL1 and TBL2, and
> database DB2 with tables TBL3 and TBL4, I'd end up with files named
> something like this, containing just the table create and data for
> each:
>
> 20060707.DB1.TBL1.sql
> 20060707.DB1.TBL2.sql
> 20060707.DB2.TBL3.sql
> 20060707.DB2.TBL4.sql
>
> This would make selective restores a lot easier, and would also allow
> us to set up development/testing environments more easily than one big
> file.
>
> I'd use mysqlhotcopy but we're in an InnoDB environment.
>
> I can implement this with a little perl script but wondered if anyone
> was aware of a tool out there already?
>
> Dan
---------- End Original Message ----------
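Since no such mysqldump flag exists, the list-then-dump approach described above can be sketched as a small shell loop. This is only a sketch: the user, password, and output directory are placeholders, and --single-transaction is chosen because the thread mentions an InnoDB environment.

```shell
#!/bin/sh
# Sketch: one .sql file per table across all databases.
# USER, PASS, and OUTDIR are placeholders; adjust for your setup.
USER=user
PASS=secret
OUTDIR=/path/to/backups
DATE=$(date +%Y%m%d)   # e.g. 20060707, matching the proposed file names

dump_all_tables() {
    # -N suppresses column headers, -B gives plain batch output,
    # so each query returns a bare list of names.
    for db in $(mysql -u "$USER" -p"$PASS" -N -B -e 'SHOW DATABASES'); do
        for tbl in $(mysql -u "$USER" -p"$PASS" -N -B \
                -e "SHOW TABLES FROM \`$db\`"); do
            # --single-transaction takes a consistent snapshot for InnoDB
            # without locking the tables for the duration of the dump.
            mysqldump -u "$USER" -p"$PASS" --single-transaction \
                "$db" "$tbl" > "$OUTDIR/$DATE.$db.$tbl.sql"
        done
    done
}

# Only attempt the dump when the client tools are actually installed.
if command -v mysqldump >/dev/null 2>&1; then
    dump_all_tables
fi
```

A cron entry pointing at this script would produce exactly the 20060707.DB1.TBL1.sql-style layout requested, and restoring one table is then just feeding the matching file back to the mysql client.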
--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe: http://lists.mysql.com/[EMAIL PROTECTED]