You have many options, like the people below suggest:
1 - Use mysqldump
2 - Use mysqlhotcopy
3 - Do the mysqlhotcopy/mysqldump yourself
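For reference, a minimal sketch of option 1 as a cron-driven script (the user name, paths, and gzip step are my placeholders, not something from this thread):

```shell
#!/bin/sh
# Nightly logical backup with mysqldump (sketch; adjust credentials/paths).
DB=my_db_name
BACKUP_DIR=/backup/mysql
STAMP=$(date +%Y%m%d)

# --single-transaction gives a consistent snapshot for InnoDB tables;
# for MyISAM, the default --lock-tables is what keeps the dump consistent.
mysqldump --user=backup --password="$MYSQL_PWD" --single-transaction \
  "$DB" | gzip > "$BACKUP_DIR/$DB-$STAMP.sql.gz"
```

Unlike running tar on the live data directory, mysqldump reads through the server, so it never races against in-flight writes.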
Since I found that neither 1 nor 2 alone gives a perfect result in many backup schemes, I started working on something that complements 1 and 2.
Is mysqlhotcopy still considered "beta"? We steered clear of it for
production use for that reason.
Tim
-Original Message-
From: Dan Buettner [mailto:[EMAIL PROTECTED]
Sent: Monday, November 13, 2006 12:39 PM
To: Van
Cc: mysql@lists.mysql.com
Subject: Re: Backing up large dbs with tar
Van, I'll second what Gerald said about mysqlhotcopy.
When we first began using MySQL at my last job, we had terrible
problems with MySQL crashing. Turned out to be due to a 3rd party
backup process attempting to lock and read the database files while
MySQL was attempting to use them.
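For comparison, mysqlhotcopy handles that locking itself: it issues LOCK TABLES ... READ and FLUSH TABLES before copying the table files, so the copy cannot race a write. A sketch of the invocation (the user and target directory are placeholders):

```shell
# Option 2: raw-file copy with server-side locking (MyISAM tables only).
mysqlhotcopy --user=backup --password="$MYSQL_PWD" \
  my_db_name /backup/mysql/
```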
Using mysqlhotcopy instead resolved the problem for us.
Van wrote:
Greetings:
I have a 600M data file that never gets backed up. The following error
occurs in the cron job:
tar: /data/mysql/"my_db_name"/"my_large_table_name".MYI: file changed as we read it
Is there a way I can set this one table to read-only prior to the backup
without affecting the rest of the database?
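One way to get the effect Van is after, sketched with the placeholder names from his message (this is essentially the LOCK + FLUSH sequence that mysqlhotcopy automates): hold a READ lock on the table in an open mysql session, flush it to disk, run tar from inside that same session so the lock stays held, then unlock.

```shell
# Quiesce one MyISAM table for the duration of the tar (sketch).
# The lock only lasts as long as the client connection, so tar is run
# via the mysql client's "system" command from within the same session.
mysql my_db_name <<'EOF'
LOCK TABLES my_large_table_name READ;
FLUSH TABLES my_large_table_name;
system tar -czf /backup/my_large_table_name.tar.gz /data/mysql/my_db_name/my_large_table_name.*
UNLOCK TABLES;
EOF
```

While the lock is held, writes to that one table block until UNLOCK TABLES; the rest of the database stays fully available.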