John Drescher wrote:
>> I have no problem with backing up the catalog on another server with
>> less expensive storage. But I do not see any use for old catalog
>> backups at all. I would "bscan into a new catalog" in all cases except
>> the one where I have a catalog backup newer than the last data backed
>> up and need to recover it, i.e. a database server crash or something
>> like that. In all other cases I would not risk the possible
>> complications...
>>
> A couple of years ago I had an issue where the database had become
> corrupt and I did not notice for over a week. The catalog backup and
> some other backups were running fine, but others were failing. If I
> did not have more than one week of catalogs, it would have been much
> harder to recover, given that there were 200 volumes or so, and
> manually scanning to recover the catalog would have taken months. One
> other thing I recommend: always have a bootstrap file for your
> catalogs. I have mine named so that each day I get a new bootstrap
> file, and that file has the jobid in its name.
> 
> 
> Job {
>   Name = "BackupCatalog"
>   Type = Backup
>   Client = dev6-fd
>   FileSet = "Catalog"
>   Schedule = "WeeklyCycleAfterBackup"
>   Storage = File
>   Messages = Standard
>   Pool = BackupCatalogs
>   ClientRunBeforeJob = "/usr/libexec/bacula/make_catalog_backup bacula hbroker hbroker"
>   ClientRunAfterJob = "/usr/libexec/bacula/delete_catalog_backup"
>   WriteBootstrap = "/mnt/vg/backups/BootStrap/%n_%i.bsr"
> }
> 
> 
> John

I keep the bootstrap file too, and I think that this is a good idea.
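In case it is useful to anyone, this is roughly how the bootstrap gets
used when the catalog itself is gone (the device name, jobid, paths and
database name below are only examples, not John's actual setup):

  # pull the catalog dump off the volume using the bootstrap file
  bextract -b /mnt/vg/backups/BootStrap/BackupCatalog_1234.bsr \
           -c /etc/bacula/bacula-sd.conf FileStorage /tmp
  # make_catalog_backup writes an SQL dump (restored under /tmp with its
  # original path, which depends on your working directory), so reload it:
  mysql bacula < /tmp/var/lib/bacula/bacula.sql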
I am using only disk backups, and all jobs are stored in separate 
files. Therefore I figured that bscan should not take significantly 
more time than reading all the backup files once.
I tried this while testing Bacula: I used one machine's backup file, an 
"empty" machine, and a live CD with the Bacula binaries and nothing 
else. I scanned the file into a new catalog and restored the machine. 
The scanning was reasonably fast.
We have a few hundred gigs of backups, and at roughly 50 MB/s bscan 
gets through 100 GB in about half an hour (100 GB / 50 MB/s is about 
33 minutes). That's why I think scanning all the backups would be more 
reasonable than using an old catalog...
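Concretely, I mean something like this (the volume names, archive
directory and database credentials below are only examples, not our
real setup):

  # feed the existing disk volumes into a freshly initialized, empty catalog;
  # -s stores the file records in the database, -m updates the media records,
  # and the last argument is the directory where the disk volumes live
  bscan -v -s -m -c /etc/bacula/bacula-sd.conf \
        -n bacula -u bacula -P password \
        -V "Vol-0001|Vol-0002" /mnt/backups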

Radek
