Re: Backing up large dbs with tar

2006-11-19 Thread Mathieu Bruneau
You have several options, as the people below have suggested:

1 - Use mysqldump
2 - Use mysqlhotcopy
or
3 - Do the mysqlhotcopy/mysqldump work yourself
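For reference, options 1 and 2 usually look something like this. This is only a sketch - the database name, backup user, password, and destination paths are hypothetical, and both commands need a running MySQL server:

```shell
# Option 1: logical backup with mysqldump (SQL statements to re-create
# the data; portable, but slower to restore for large databases).
mysqldump --user=backup --password='secret' --opt my_db_name \
    > /backups/my_db_name.sql

# Option 2: raw-file copy of MyISAM tables with mysqlhotcopy
# (locks the tables, copies the table files, then unlocks).
mysqlhotcopy --user=backup --password='secret' my_db_name /backups/
```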


I found that neither 1 nor 2 alone gives a perfect result in many
backup schemes, so I started working on something that complements
them to produce _good_ backups. I eventually realized someone had
already done a similar job, put more work into it, and seems pretty
serious about it!

So you should look at mysql-zrm
(http://www.zmanda.com/backup-mysql.html). The releases are still
young, but everything works quite well, and I'm sure this project is
well on its way to giving MySQL good backup capabilities. It simply
uses mysqldump or mysqlhotcopy depending on the configuration you ask
for! It's essentially the glue around mysqldump/mysqlhotcopy needed
for easy, reliable backups!

Regards.

-- 
Math
aka ROunofF

==
argontechnologies.ca


-- 
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe: http://lists.mysql.com/[EMAIL PROTECTED]



RE: Backing up large dbs with tar

2006-11-14 Thread Tim Lucia
Is mysqlhotcopy still considered beta?  We steered clear of it for
production use for that reason.

Tim




Re: RE: Backing up large dbs with tar

2006-11-14 Thread Dan Buettner

Interesting question - I noticed that in the comments too.  For what
it's worth, I used it in a production environment for more than 5
years, from 2001 on, with no problems.  I did restore a few things
here and there, so I know it was working!  ;)

I use mysqldump for backups now, because we use InnoDB tables where I work.
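For InnoDB tables, mysqldump can take a consistent snapshot without holding table locks for the whole dump. A minimal sketch, with hypothetical credentials and paths:

```shell
# --single-transaction starts a transaction and dumps a consistent
# snapshot of InnoDB tables without blocking concurrent writes.
mysqldump --user=backup --password='secret' --single-transaction \
    --all-databases > /backups/full-dump.sql
```

Note that --single-transaction only gives a consistent view for transactional engines like InnoDB; MyISAM tables still need locking.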

Dan

On 11/14/06, Tim Lucia [EMAIL PROTECTED] wrote:

Is mysqlhotcopy still considered beta?  We steered clear of it for
production use for that reason.

Tim




Re: Backing up large dbs with tar

2006-11-13 Thread Gerald L. Clark

Van wrote:

Greetings:

I have a 600M data file that never gets backed up.  The following error
occurs in the cron job:

tar: /data/mysql/my_db_name/my_large_table_name.MYI: file changed as
we read it

Is there a way I can set this one table to read-only prior to the backup
without affecting other db writes during this operation?


Thanks,
Van


Look at mysqlhotcopy.
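To answer the locking part of the question directly: one approach is to hold a read lock in a client session for the duration of the tar run. A sketch, using the paths from the question and a hypothetical password. Caveats: FLUSH TABLES WITH READ LOCK locks all tables, not just the one, and this relies on the mysql client accepting the \! (system) command in batch mode:

```shell
# The lock lasts only as long as the client session that took it,
# so run tar from inside the mysql client while the lock is held.
mysql --user=root --password='secret' <<'EOF'
FLUSH TABLES WITH READ LOCK;
\! tar czf /backups/my_db_name.tar.gz /data/mysql/my_db_name
UNLOCK TABLES;
EOF
```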


--
Gerald L. Clark
Supplier Systems Corporation




Re: Backing up large dbs with tar

2006-11-13 Thread Dan Buettner

Van, I'll second what Gerald said about mysqlhotcopy.

When we first began using MySQL at my last job, we had terrible
problems with MySQL crashing.  It turned out to be due to a third-party
backup process attempting to lock and read the database files while
MySQL was attempting to use them.

Using mysqlhotcopy to copy the files elsewhere, and excluding the data
directory from the backup software, gave us a stable solution.

mysqldump might also work well for you, as it can lock
tables/databases and give you a consistent snapshot.  Potentially
takes longer to restore from a mysqldump file though.
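A minimal sketch of that table-locking dump, with hypothetical credentials (--lock-tables is in fact mysqldump's usual behavior for non-transactional tables):

```shell
# --lock-tables locks all tables in the database for the duration of
# its dump, giving a consistent per-database snapshot of MyISAM data.
mysqldump --user=backup --password='secret' --lock-tables my_db_name \
    > /backups/my_db_name.sql
```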

HTH,
Dan





