Re: [Bacula-users] FW: Backup of a mysql dump of the database

2007-06-05 Thread Rich
On 2007.05.23. 11:53, Marc wrote:
> Hi,
>
> I've searched the archives, but I cannot find any relevant information.
> Therefore my question: is it possible to do a database-by-database dump and
> back up these dump files? Because of database sizes, it would be very nice if
> this could be done database by database. I mean: first dump database 1,
> move it to Bacula, remove the dump, then dump database 2, move it to
> Bacula, etc.
>
> Can this be done?

I didn't see whether you successfully resolved your problem, so here's a
quick & crude script which backs up each database as a separate file,
and does this for all of the databases.
The script does not protect against simultaneous runs and does not compress
the dumps (as that is done by the Bacula job), though all of this is very
easy to add.

For the Bacula job, just add something like:

   ClientRunBeforeJob = /scripts/mysqlbackup create
   ClientRunAfterJob = /scripts/mysqlbackup remove

Note: the script probably has several problems, so feel free to correct
those ;)
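For context, this is roughly where those two lines would live in the Director configuration; the resource names, client name, and the GZIP option below are illustrative guesses, not from the original post:

```
Job {
  Name = "MySQLDumpBackup"
  Type = Backup
  Client = myhost-fd
  FileSet = "mysql-dump-fs"
  ClientRunBeforeJob = "/scripts/mysqlbackup create"
  ClientRunAfterJob = "/scripts/mysqlbackup remove"
  # ... Schedule, Storage, Pool, Messages as usual ...
}

FileSet {
  Name = "mysql-dump-fs"
  Include {
    Options {
      compression = GZIP   # compress the dumps at backup time
    }
    File = /var/tmp/database_dump
  }
}
```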

-
#!/bin/bash

HOME=/root
DUMPDIR=/var/tmp/database_dump
MYSQLDUMP=/usr/local/mysql/bin/mysqldump
DUMPCOMMAND="$MYSQLDUMP --add-drop-database --add-drop-table --add-locks --extended-insert --single-transaction --quick"

fail() {
    echo "failure: $1"
    exit 1
}

create() {
    if [ ! -d "$DUMPDIR" ]; then
        mkdir -p "$DUMPDIR" || fail "unable to create directory $DUMPDIR"
    fi

    for i in `echo "show databases" | mysql -N`; do
        $DUMPCOMMAND "$i" > "$DUMPDIR/$i" || fail "unable to dump database $i"
    done
}

remove() {
    rm "$DUMPDIR"/* || fail "unable to remove db dumps"
}

case "$1" in
    create)
        create
        ;;
    remove)
        remove
        ;;
    *)
        fail "pass either create or remove"
esac


-- 
  Rich

-
This SF.net email is sponsored by DB2 Express
Download DB2 Express C - the FREE version of DB2 express and take
control of your XML. No limits. Just data. Click to get it now.
http://sourceforge.net/powerbar/db2/
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] FW: Backup of a mysql dump of the database

2007-06-05 Thread Marc
Thanks, Rich,

I'm going to test this script and see how it goes. I hope I have enough
disk space to hold all the dumps during the backup.

Kind regards,
Marc 



[Bacula-users] FW: Backup of a mysql dump of the database

2007-05-23 Thread Marc
Hi,
 
I've searched the archives, but I cannot find any relevant information.
Therefore my question: is it possible to do a database-by-database dump and
back up these dump files? Because of database sizes, it would be very nice if
this could be done database by database. I mean: first dump database 1,
move it to Bacula, remove the dump, then dump database 2, move it to
Bacula, etc.
 
Can this be done?
 
Kind regards,
Marc





Re: [Bacula-users] FW: Backup of a mysql dump of the database

2007-05-23 Thread Ludovic Strappazon
Hi Marc,

You just have to run a job for each database, with a RunBeforeJob and
RunAfterJob, as is done by the BackupCatalog job.

Regards,
Ludovic Strappazon. 



Re: [Bacula-users] FW: Backup of a mysql dump of the database

2007-05-23 Thread Jerome Massano
Hi

You can do a database-by-database dump with mysqldump (see man
mysqldump). I suppose you could do what you want by setting up multiple
jobs:

First job has:
 Client Run Before Job = mysqldump database1 > file-1
 Fileset = file-1-fs
 Client Run After Job = rm file-1

Second job has:
 Client Run Before Job = mysqldump database2 > file-2
 Fileset = file-2-fs
 Client Run After Job = rm file-2

etc. Of course, you will have to configure file-1-fs and file-2-fs to
back up file-1 and file-2.

I think the syntax is wrong, because I did it without looking at the
documentation, but the idea is there.

If you are using another database, I think there would be an equivalent
to mysqldump.

Hope that helps.

PS: sorry for my really bad English ^^




Re: [Bacula-users] FW: Backup of a mysql dump of the database

2007-05-23 Thread Marc
I have to maintain a server park with 500+ databases, with databases being
removed and added every day. So defining a job for each database is not an
option.

Creating a backup of the database files is not what I'm looking for, as I
would be unable to restore InnoDB databases. InnoDB databases are stored in
one file, not in separate files / directories as MyISAM.

Regards,
Marc  



Re: [Bacula-users] FW: Backup of a mysql dump of the database

2007-05-23 Thread Ludovic Strappazon

Marc wrote:
> I have to maintain a server park with 500+ databases, and databases being
> removed and added every day. So defining a job for each database is not an
> option.

I can't imagine a way to back them up one by one in a single job.

> Creating a backup of the database files is not what I'm looking for, as I
> will be unable to restore InnoDB databases. InnoDB databases are stored in
> one file, not in separate files / directories as MyISAM.

I think I don't understand that.

Regards,
Ludovic Strappazon.



Re: [Bacula-users] FW: Backup of a mysql dump of the database

2007-05-23 Thread Martin Simmons
On Wed, 23 May 2007 12:28:20 +0200, Marc said:
> I have to maintain a server park with 500+ databases, and databases being
> removed and added every day. So defining a job for each database is not an
> option.

You can probably build a solution based on fifos, but it will still be
complicated if you want to restore only 1 of the 500 databases.  See:

http://www.bacula.org/rel-manual/FileSet_Resource.html#readfifo

http://paramount.ind.wpi.edu/wiki/doku.php?id=application_specific_backups#postgresql
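To illustrate the fifo idea: the dump is written into a named pipe while the backup program reads the other end, so the full dump never lands on disk. In this generic sketch, 'echo' stands in for mysqldump and 'cat' stands in for bacula-fd reading a FileSet entry marked readfifo=yes; the path is made up:

```shell
#!/bin/sh
# Sketch of fifo-based backup (illustrative only).
FIFO=$(mktemp -u /tmp/mysql.fifo.XXXXXX)   # pick an unused name
mkfifo "$FIFO"
echo "dump data" > "$FIFO" &               # writer: would be mysqldump in real use
RESULT=$(cat "$FIFO")                      # reader: would be the Bacula file daemon
wait                                       # reap the background writer
rm -f "$FIFO"
echo "$RESULT"
```

The writer blocks until the reader opens the pipe, which is why the real thing needs careful job ordering.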

__Martin





Re: [Bacula-users] FW: Backup of a mysql dump of the database

2007-05-23 Thread Chris Hoogendyk


Ludovic Strappazon wrote:
>> I have to maintain a server park with 500+ databases, and databases being
>> removed and added every day. So defining a job for each database is not an
>> option.
> I can't imagine a way to back them up one by one in a single job.
>> Creating a backup of the database files is not what I'm looking for, as I
>> will be unable to restore Inno databases. Inno databases are stored in one
>> file, not separate files / directories as MyISAM.
> I think I don't understand that.

Check out the backup pages on the MySQL web site. They discuss these
issues and what the options are. InnoDB is clearly different from MyISAM.
They mention the options and pitfalls for each.




---

Chris Hoogendyk

-
   O__   Systems Administrator
  c/ /'_ --- Biology  Geology Departments
 (*) \(*) -- 140 Morrill Science Center
~~ - University of Massachusetts, Amherst 

[EMAIL PROTECTED]

--- 

Erdös 4





Re: [Bacula-users] FW: Backup of a mysql dump of the database

2007-05-23 Thread Ryan Novosielski

Seems to me the MySQL backup script on the wiki may do this, but I could
be thinking of the svn backup script. Perhaps that one would help,
though, as I know it does go through one by one.

Seems to me this would be a pretty easy job, though -- you first select
to see which databases are on the machine, then use this list to loop
through and back each one up. No problem, right? I'm not sure, myself.

=R


--
  _  _ _  _ ___  _  _  _
 |Y#| |  | |\/| |  \ |\ |  | |Ryan Novosielski - Systems Programmer III
 |$| |__| |  | |__/ | \| _| |[EMAIL PROTECTED] - 973/972.0922 (2-0922)
 \__/ Univ. of Med. and Dent.|IST/AST - NJMS Medical Science Bldg - C630




Re: [Bacula-users] FW: Backup of a mysql dump of the database

2007-05-23 Thread Darien Hager
Marc wrote:
> I have to maintain a server park with 500+ databases, and databases being
> removed and added every day. So defining a job for each database is not an
> option.
>
> Creating a backup of the database files is not what I'm looking for, as I
> will be unable to restore InnoDB databases. InnoDB databases are stored in
> one file, not in separate files / directories as MyISAM.
First off, you shouldn't need to define more than one job per machine.
Your run-before-job script(s) should be perfectly able to determine which
databases to dump. Very quickly, this is how my experience has gone with
PostgreSQL databases.

1. The higher-ups have said no downtime is allowable, so I cannot simply
shut down the database server, back up the raw working files, and
restart it. I cannot back up these raw files while the server is running,
because I would never get reliable data, as it's changing them constantly.

2. There is a bit of a bug in the run-before-job stuff: if your
scripting takes more than 30 minutes, then the remainder of the job
(the actual data transfer, run-after-job) does not run and the job is
marked as an error. This has to do with certain connection timeouts
between the FD and the SD. You could make a source change and recompile
Bacula to increase this limit, but I didn't want to do that.

3. My set-up involves a few dozen machines each with anywhere from 50 to 
300 databases on them.

4. First I tried using FIFOs, but ran into various issues. I would not 
recommend them unless you are critically short on disk space on your 
servers, since they bring their own complications. (If you do decide to 
try FIFOs, let me know, I have some scripts from then which may be helpful.)


Here's what I do currently. I have a DB jobdef and a DBPREP jobdef. 
The DBPREP stuff uses a client-run-after-job script (to avoid that 30 
minute timeout issue) to actually do the dumping of the databases into 
files on the disk. The DB jobs have a higher Priority= number, so they 
run after all DBPREP jobs are done. DBPREP scripts change a little file 
called rval.dat every time they finish and that file stores whether 
the DBPREP job actually ran OK or not. The DB client-run-before-job 
scripts check that file before doing the real backup and cancel the DB 
job if the previous DBPREP didn't work out.
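A minimal sketch of that handshake (the rval.dat name and its contents are assumptions based on the description above, not a Bacula feature):

```shell
#!/bin/sh
# Sketch of the DBPREP/DB status handshake (names assumed from the post).
RVAL=$(mktemp)                 # stands in for rval.dat on the client

# --- DBPREP ClientRunAfterJob: record whether the dumps succeeded ---
echo 0 > "$RVAL"               # 0 = all dumps written OK

# --- DB ClientRunBeforeJob: check the flag before the real backup ---
if [ "$(cat "$RVAL")" = "0" ]; then
    STATUS=ok
else
    STATUS=cancel              # a real script would 'exit 1' here,
fi                             # which makes Bacula cancel the DB job
rm -f "$RVAL"
echo "$STATUS"
```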

On full backups, the scripts I have on each client dump and compress
each database into dbname.sql.gz, and then delete the older files. On
differential or incremental days, the script dumps each database
into dbname.YYYYMMDD.gz. Then it computes a differential between that
and the original dbname.sql.gz file; the differential is
dbname.YYYYMMDD.diff. Finally, it deletes dbname.YYYYMMDD.gz and keeps
the differential. I use xdelta3 for generating diffs, since the default
diff tool tends to break very badly when handling large files -- it uses
too much memory.

-- 
Darien Hager
[EMAIL PROTECTED]




Re: [Bacula-users] FW: Backup of a mysql dump of the database

2007-05-23 Thread David Romerstein
On Wed, 23 May 2007, Marc wrote:

> I've searched the archives, but I cannot find any relevant information.
> Therefore my question: is it possible to do a database-by-database dump and
> back up these dump files? Because of database sizes, it would be very nice if
> this could be done database by database. I mean: first dump database 1,
> move it to Bacula, remove the dump, then dump database 2, move it to
> Bacula, etc.

I would do this in a shell script, instead of just trying to schedule it 
within bacula. Something like:

---BEGIN---
#!/bin/bash
for THING in `mysql -u$USER -p$PASSWORD -e 'show databases;'`
do
  mysqldump -u$USER -p$PASSWORD -l $THING > /path/to/temp/$THING.sql
  /etc/bacula/bconsole -c /etc/bacula/bconsole.conf << END_OF_DATA
run job=database yes
END_OF_DATA
  rm -rf /path/to/temp/$THING.sql
done
---END---

That's quick and dirty, and you'd certainly want to test it, but I think
it illustrates the point. You'd have to create a job called 'database',
with a fileset that points to the directory you're dumping the files to,
and you might want to think about piping each table's dump through gzip
to save space. If there are databases that don't need to be backed up (the
'mysql' and 'lost+found' dbs come to mind), you might need to do some
parsing of the 'show databases' output (and I'm not sure how to suppress
the output of the column name).
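The skip-list parsing could look something like this sketch; the here-doc stands in for the real `mysql -N -e 'show databases'` call (the -N flag suppresses the column-name header), and the database names and skip list are made up:

```shell
#!/bin/sh
# Sketch: filter system schemas out of the database list before dumping.
# The here-doc stands in for `mysql -N -e 'show databases'` so the logic
# can be shown without a server; the names are illustrative.
DBS=$(grep -vE '^(mysql|information_schema|lost\+found)$' <<'EOF'
mysql
shop
blog
lost+found
EOF
)
# $DBS now holds only the databases worth dumping: shop and blog
echo "$DBS"
```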

I have a single 650GB MySQL 4.1 db that I do a full backup of every two 
weeks. Takes about 19 hours to dump all of the tables and gzip them, and 
the backups end up around 36GB.

-- D

