I'm curious whether anyone has thoughts on how I can handle a data retention issue I've been facing with a larger dataset.

My process, roughly, is to run mysqldump against the DB and gzip the output, then transfer it to a different machine for archival onto optical media (yes, I know there's a shelf life to the media involved there). This is a secondary backup to the one I'm doing for system recovery purposes - not the only method of backup, but I don't have access to that storage system, as it's provided by the ISP. Recently the file has gotten so big that it no longer fits on standard DVD-R media. I've considered backing up only key data, but for the archive it's much more convenient to have the entire structure intact in one location; additionally, we occasionally build a test machine from this data, so its integrity is moderately important. Ideally, I'd like to be able to script the compression and slicing right on the server that does the backup, but I realize this may not be possible.
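For reference, the process above amounts to something like the following sketch. The user, password, database, host, and path names are placeholders, not anything from my actual setup:

```shell
#!/bin/sh
# Sketch of the current process: dump, compress, then ship the file
# off-box for burning to optical media. All names are placeholders.
mysqldump -u user -ppassword databasename | gzip > backup.sql.gz
scp backup.sql.gz archive-host:/archive/mysql/
```

It's that single backup.sql.gz that has now outgrown a DVD-R.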

I'm looking at using RAR to archive the output file and slice it into smaller segments so I can distribute the data over two DVD-Rs (or multiple CD-Rs, if desired).
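If rar is installed on the backup server, the slicing could be scripted right there. A sketch, where the 4480m volume size (to fit a 4.7 GB DVD-R) and the file names are my assumptions:

```shell
#!/bin/sh
# Sketch: slice the compressed dump into DVD-R-sized RAR volumes.
# backup.sql.gz and the 4480m volume size are placeholders.
# -v4480m starts a new volume every ~4.4 GiB.
rar a -v4480m backup.rar backup.sql.gz

# Extraction later only needs all volumes in one directory:
#   unrar x backup.part1.rar
```

One nicety of RAR over plain splitting is that each volume carries its own CRC, so a bad burn shows up at extraction time.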

Does anyone have any feedback on this approach? Without laying out money for a higher-capacity DVD/Blu-ray drive or a tape backup system, what options are you using for data retention?


Have a look at the split command so that the output goes into more than one file.

e.g.

mysqldump -e -u user -ppassword databasename | split -b 1024m
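Since the poster already gzips the dump, split can be chained after gzip, and the slices rejoined with cat at restore time. A sketch; the 4480m slice size (roughly one DVD-R) and the file names are assumptions, not anything from the thread:

```shell
#!/bin/sh
# Sketch: dump, compress, and slice into DVD-R-sized pieces in one pipeline.
# user, password, databasename, and the part-file prefix are placeholders.
mysqldump -e -u user -ppassword databasename \
  | gzip \
  | split -b 4480m - dump.sql.gz.part-

# Restore: rejoin the slices in order and feed them back to mysql.
cat dump.sql.gz.part-* | gunzip | mysql -u user -ppassword databasename
```

split names the pieces with alphabetical suffixes (...part-aa, ...part-ab, ...), so the shell glob expands them back in the right order.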


--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe:    http://lists.mysql.com/[EMAIL PROTECTED]
