Hi,

We've had a couple of threads about this before, but I tried it now with Bacula 5.0.2 and wanted to report my technique. Comments are welcome.
Here's what I did:

1) Wrote a new pool definition:

Pool {
  Name = ArchiveMay2010
  Pool Type = Backup
  Volume Retention = 11 months
  Job Retention = 11 months
  File Retention = 11 months
  Recycle Pool = Scratch
}

Tapes will come out of Scratch. The plan is to make a new full archive every five months, so these tapes will be forgotten after two more full archives are done.

2) Wrote a new schedule for a one-time run:

Schedule {
  Name = ArchiveMay2010Schedule
  Run = May 14 at 14:10
}

3) Wrote a new JobDefs:

JobDefs {
  Name = "ArchiveMay2010DefJob"
  Type = Backup
  Level = Full
  Accurate = no
  Client = bac.genomics.upenn.edu-fd
  # FileSet -> defined in each job
  Pool = ArchiveMay2010
  Schedule = ArchiveMay2010Schedule
  Storage = ts
  Spool Data = yes
  Spool Attributes = yes
  Allow Duplicate Jobs = no
  Priority = 5
  Messages = Standard
}

4) Wrote a script to generate a bunch of job configs; in this case I want to archive each user's home directory:

#!/bin/bash
USERS=$(getent passwd | cut -f 1 -d ':' | tail -n 111 | xargs echo)
for i in $USERS
do
cat > "$i-archiveMay2010.conf" <<ASDF
# config file for $i
Job {
  Name = "$i-archiveMay2010"
  JobDefs = ArchiveMay2010DefJob
  FileSet = "$i-files"
}
FileSet {
  Name = "$i-files"
  Include {
    Options {
      onefs = yes
      signature = MD5
    }
    File = /gpfs/fs0/u/$i/
  }
  Exclude {
  }
}
ASDF
done

5) Run that script to generate the files, then include those configs in the Director config:

# include these archive jobs
@|"/bin/sh -c '/bin/cat /opt/bacula/etc/jobs/May2010Archive/*-archiveMay2010.conf'"

Also make sure your configs are in the right files, then "reload" in bconsole.

Regards,
--
Alex Chekholko
ch...@genomics.upenn.edu

------------------------------------------------------------------------------
_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
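[Editor's note: for anyone who wants to try the per-user generation loop from step 4 without a real passwd database or a Bacula install, here is a self-contained sketch. The usernames and the output directory are placeholders, not from the original post; the heredoc body mirrors the Job/FileSet pair the script emits.]

```shell
#!/bin/bash
# Sketch of the step-4 generator: one Job + FileSet config per user.
# "alice" and "bob" stand in for the getent-derived user list, and the
# configs are written to a temp dir instead of the Bacula etc tree.
set -eu

OUTDIR=$(mktemp -d)
USERS="alice bob"

for i in $USERS
do
cat > "$OUTDIR/$i-archiveMay2010.conf" <<ASDF
# config file for $i
Job {
  Name = "$i-archiveMay2010"
  JobDefs = ArchiveMay2010DefJob
  FileSet = "$i-files"
}
FileSet {
  Name = "$i-files"
  Include {
    Options {
      onefs = yes
      signature = MD5
    }
    File = /gpfs/fs0/u/$i/
  }
}
ASDF
done

# list what was generated
ls "$OUTDIR"
```

Because the heredoc delimiter (ASDF) is unquoted, $i expands inside the body while the Bacula directives pass through literally; that is the whole trick the original script relies on.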