Dear List :)
I'd like to migrate my jobs into separate pools. I had one default pool
for testing, and I have completed a number of different backups, all to
my satisfaction. I have configured three extra pools, full, diff, and
inc (very original, I know). I wish to migrate all my existing full,
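Moving existing jobs between pools is normally done with a Migration job. A minimal sketch, assuming the pool names above (job, client, and fileset names here are hypothetical); note the source pool needs a Next Pool pointing at the destination:

```
# bacula-dir.conf -- sketch only
Pool {
  Name = Default             # existing source pool
  Pool Type = Backup
  Next Pool = full           # where migrated jobs land
}

Job {
  Name = migrate-to-full
  Type = Migrate
  Pool = Default             # pool to migrate *from*
  Selection Type = Job
  Selection Pattern = ".*"   # regex over job names; narrow to full jobs as needed
  Client = myhost-fd         # required by the parser
  FileSet = "Full Set"
  Messages = Standard
}
```

Running `run job=migrate-to-full` in bconsole then copies the selected jobs' data to volumes in the `full` pool and updates the catalog.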
Hello
I have some problems with my MySQL and SVN backups. I will explain using the
MySQL backup as an example:
In the Job declaration I have Client Run Before Job = .../mysql_dump.sh (backup
solution: http://wiki.bacula.org/doku.php?id=application_specific_backups:mysql)
When I run a backup job (Level: Full), this script
Maybe the time stamp on the file changed.
---Guy
(via iPhone)
On 30 Jan 2011, at 11:00, aiv bacula-fo...@backupcentral.com wrote:
Hello
I have some problems with my MySQL and SVN backups. I will explain using the
MySQL backup as an example:
In the Job declaration I have Client Run Before Job = .../mysql_dump.sh
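For reference, a ClientRunBeforeJob dump script along the lines of the wiki recipe might look like this sketch. All paths, the user name, and the structure here are assumptions, not the actual script from the wiki; this version only builds the mysqldump command line so the sketch runs anywhere, where a real script would execute it and exit non-zero on failure so Bacula aborts the job.

```shell
#!/bin/sh
# Sketch of a mysql_dump.sh for ClientRunBeforeJob (names hypothetical).
DUMP_FILE="${DUMP_FILE:-/var/lib/bacula/mysql_all.sql}"
MYSQL_USER="${MYSQL_USER:-backup}"

build_dump_cmd() {
    # --single-transaction: consistent snapshot for InnoDB without locking tables
    echo "mysqldump --user=$MYSQL_USER --all-databases --single-transaction --result-file=$DUMP_FILE"
}

build_dump_cmd
```

The FileSet then includes the dump file itself; a fresh time stamp on it every run is expected, since the file is rewritten before each job.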
Short question -- can a running Bacula Console dump out what it thinks
my configuration is?
Long question -- my Bacula installation is acting a little weird.
I'm using File Storage. A couple of backups didn't have the right name,
and one of those didn't auto-label the backup file.
Most of my
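One answer to the short question: bconsole's `show` command prints the Director's in-memory resources, i.e. the configuration as it was actually parsed (resource names vary per installation):

```
*show jobs       # every Job resource as the Director loaded it
*show pools
*show storage
*show all        # dump every resource type
```

Comparing that output against the .conf files is a quick way to spot a Director that hasn't been reloaded since an edit.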
On 01/30/11 06:51, Mister IT Guru wrote:
Dear List :)
I'd like to migrate my jobs into separate pools. I had one default pool
for testing, and I have completed a number of different backups, all to
my satisfaction. I have configured three extra pools, full, diff, and
inc (very original, I
Hello everyone,
I have had Bacula configured and running for quite a while now, without
any major issues. However, as I keep adding client nodes, more and more
often I run into situations where some jobs are waiting for other
jobs to finish.
The configuration I use is as follows:
Oh I guess I can answer this one myself already.
Creating dedicated Storages for Pools should do the trick.
If anyone knows a better solution, please do speak up.
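For the archives, one sketch of "dedicated Storages for Pools" with File storage looks like this (all names, paths, and passwords are hypothetical). Maximum Concurrent Jobs must also be raised on the Director, Storage, and Client resources, or jobs will still queue:

```
# bacula-sd.conf -- one File device per pool
Device {
  Name = FileDev-full
  Media Type = File-full       # distinct Media Type keeps this pool on its own device
  Device Type = File
  Archive Device = /srv/bacula/full
  Label Media = yes
  Random Access = yes
  Automatic Mount = yes
}

# bacula-dir.conf
Storage {
  Name = File-full
  Address = sd.example.com
  Password = "secret"
  Device = FileDev-full
  Media Type = File-full
  Maximum Concurrent Jobs = 10
}
Pool {
  Name = full
  Pool Type = Backup
  Storage = File-full          # pool pinned to its own storage/device
}
```

Repeating the pattern for the diff and inc pools lets jobs targeting different pools write concurrently instead of contending for a single device.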
On 30 January 2011 19:18, Bart Swedrowski b...@timedout.org wrote:
Hello everyone,
I have got a Bacula configured and running for
Hi Koshi
I seem to have a similar problem (in Bacula 3.0.3). After purging the volume,
it indeed has no more associated jobs as may be verified in the JobMedia table
(select * from JobMedia where MediaId = ID). However, the volume is not emptied
and upon reuse it seems to be full already.
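This matches how purging works: it only marks the volume reusable in the catalog, and the file on disk keeps its old contents and size until it is relabeled or overwritten. If the goal is for purged File volumes to actually shrink on disk, releases newer than the 3.0.3 mentioned above (Bacula 5.0 and later, if I recall correctly) added truncation on purge; roughly:

```
# bacula-dir.conf (Bacula >= 5.0; sketch, names hypothetical)
Pool {
  Name = full
  Pool Type = Backup
  Action On Purge = Truncate
}
```

combined with an explicit console command such as:

```
*purge volume action=truncate allpools storage=File
```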
On 30/01/2011 18:03, Phil Stracchino wrote:
On 01/30/11 06:51, Mister IT Guru wrote:
Dear List :)
I'd like to migrate my jobs into separate pools. I had one default pool
for testing, and I have completed a number of diffrent backups, all to
my satisfaction. I have configured three extra
I'm looking for help with an LTO4 tape writing problem.
btape:
tape 1 : fine, but ran at about 55MB/s
tape 2 : failed after a few hundred GB
tape 3 : failed after a few hundred GB
tape 4 : failed after hours of running at only 10MB/s
JobId 0: Fatal error: Unable to write EOF.
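When an LTO drive shows symptoms like these (slow streaming, EOF write failures), the usual first step is btape's built-in checks, run against the same Device definition the SD uses (config path and device node assumed here):

```
btape -c /etc/bacula/bacula-sd.conf /dev/nst0
*test      # runs the standard tape-compatibility tests
*fill      # optionally writes until end-of-tape to exercise the full media
```

Sustained 10 MB/s on LTO-4 typically means the drive is shoe-shining because data isn't arriving fast enough to keep it streaming; enabling data spooling in the Job (`Spool Data = yes`) is the common fix, and repeated failures across several tapes after that points at the drive, cabling, or HBA rather than the media.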
On 1/28/2011 6:24 PM, Marc Dojka wrote:
Hi all,
I think I already have the answer, but wanted to double-check. It's not
possible to have the private key for data encryption password-protected,
correct? Thanks.
What concern are you trying to resolve by having some kind of encryption?
--
For the backups: The media is stored at an offsite location. When the
media leaves my control, all data must be encrypted. This is for policy
reasons, insurance reasons, and ensures confidentiality of customer
information as well as HR records.
For the keys: So even if both the backups and
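For context, the File Daemon data-encryption setup under discussion is configured roughly like this (paths and the client name are hypothetical); the thread's question is whether the PKI Keypair file can carry a passphrase:

```
# bacula-fd.conf -- PKI data encryption sketch
FileDaemon {
  Name = client1-fd
  PKI Signatures = Yes
  PKI Encryption = Yes
  PKI Keypair = "/etc/bacula/client1-fd.pem"   # certificate + private key, PEM
  PKI Master Key = "/etc/bacula/master.cert"   # public-only escrow key for restores
}
```

The Master Key holds only a public key, so keeping its private half offline (and protected) covers the escrow copy even though the FD's own keypair file must be readable unattended.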
On 1/30/2011 8:33 PM, Marc Dojka wrote:
On Sun, Jan 30, 2011 at 8:22 PM, Dan Langille d...@langille.org wrote:
On 1/28/2011 6:24 PM, Marc Dojka wrote:
Hi all,
I think I already have the answer, but wanted to double check.
It's not
On Sun, Jan 30, 2011 at 12:59 PM, hymie! hy...@lactose.homelinux.net wrote:
Short question -- can a running Bacula Console dump out what it thinks
my configuration is?
Long question -- my Bacula installation is acting a little weird.
I'm using File Storage. A couple of backups didn't have