Greetings!

I'm hopeful that someone can give us a little help. We currently run Bacula 
2.4.4. We run backups to two storage devices, and each device has its own 
spool file directory. We've created a script that checks the spool directory 
at the start of each job to make sure it's empty, since we've had a few cases 
where Bacula left spool files behind after a crash.

The issue comes in when we have to run a job on the other drive. For example, 
twice a year we run full backups of all of our file sets to be stored offsite. 
During this period, jobs that would normally run on our second tape drive are 
manually run on the primary drive instead, so that the second drive can be 
devoted to the offsite jobs. Unfortunately, the spool check script doesn't know 
about the storage change, so it ends up clobbering a valid spool file for the 
secondary drive instead of checking the primary drive's directory.

I've tried to find a variable to pass to the "Run Before Job" command to tell 
it the actual current storage, but have had no luck (${Storage} doesn't expand 
during the call). Does anyone know of a way to pass the storage to the script 
(without using Python)?
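One workaround, in case no such variable exists in 2.4.4: pass the spool 
directory to the script explicitly as an argument from each Job resource, so 
the script never has to infer the storage. A minimal sketch (the script name, 
spool paths, and the `*.spool` glob are my assumptions, not anything from your 
config):

```shell
#!/bin/sh
# check_spool.sh -- hypothetical pre-job spool check.
# Clears stale spool files from the directory given as $1, so each
# Job resource can name its own drive's spool directory explicitly.

check_spool() {
    spool_dir="$1"
    if [ -z "$spool_dir" ] || [ ! -d "$spool_dir" ]; then
        echo "check_spool: spool directory '$spool_dir' not found" >&2
        return 1
    fi
    # Delete only files matching the spool-file pattern, left behind
    # by a crashed job; anything else in the directory is kept.
    find "$spool_dir" -mindepth 1 -maxdepth 1 -type f \
        -name '*.spool' -print -delete
}

check_spool "$1"
```

Each job would then call it with its own drive's directory, e.g. 
RunBeforeJob = "/usr/local/bin/check_spool.sh /var/spool/bacula/drive1". 
For the jobs you temporarily redirect during the offsite window you'd still 
have to change the argument by hand, but at least the directory is explicit 
in the config rather than guessed by the script.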

Thank You,
Tom

+----------------------------------------------------------------------
|This was sent by tomisom.s...@gmail.com via Backup Central.
|Forward SPAM to ab...@backupcentral.com.
+----------------------------------------------------------------------



------------------------------------------------------------------------------
_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
