Hi everyone,
I have been having problems with an Always Incremental backup that used
to run fine with the same hardware and configuration in version 16.2.
The first time the job runs, it is promoted to Full. The director's
messages report that the job finished with status OK (the job table in
the catalog shows status 'T' for that job).
The next time the job runs, the director complains with:
'Prior failed job found in catalog. Upgrading to Full.'
I don't know why this is happening. Unfortunately, the logged message
doesn't show the JobId of the supposedly failed job.
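Since the log line doesn't include the JobId, the catalog can be
inspected directly, e.g. via bconsole's sqlquery command (a sketch
against the standard Bareos PostgreSQL schema; 'A', 'f' and 'E' are the
usual status codes for canceled, fatal error and non-fatal errors):

```
*sqlquery
SELECT jobid, level, jobstatus, starttime
  FROM job
 WHERE name = 'marcelo-qosmio-backup-data'
   AND jobstatus IN ('A', 'f', 'E')
 ORDER BY starttime DESC;
```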
The Bareos server 19.2.6 is installed in a jail on FreeNAS (FreeBSD
11.3) and uses PostgreSQL 11 running in another jail. I deleted and
reinstalled everything in those two jails to rule out upgrade problems.
I even compiled Bareos 19.2.6 on the server to be sure that everything
was up to date. I deleted the old bareos database and created a new one
with the scripts provided. Using bconsole, I disabled all the clients
but one, and also disabled all the jobs for that client but one. When I
ran everything in a clean install with one client and one job, I got
the same error.
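For anyone wanting to reproduce the isolation step: the narrowing down
was done with bconsole's enable/disable commands, along the lines of
(the "other" client and job names here are placeholders):

```
*disable client=some-other-client-fd
*disable job=some-other-job
*status director
```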
I reported this problem as a bug against version 17.2, but it is still
an issue. I have noticed other people having problems with AI jobs as
well:
https://bugs.bareos.org/view.php?id=958
Any idea what might be causing this? Each time this job runs it
consumes 30 GB, and I run out of space on the NAS server very quickly.
** Update:
I think I found the problem, and I believe it is a bug.
With a much smaller test FileSet and only one client enabled, the job
completed and the next runs were indeed incremental. Same client, same
configuration as the failing one above, except for the reduced FileSet.
I erased all the data and recreated the database to start fresh, and I
managed to force the error by asking the director to run the same job
multiple times while the first backup was still running, thus forcing
cancellations (the job has Cancel Queued Duplicates = yes). Those
canceled jobs seem to 'confuse' the director somehow:
Terminated Jobs:
 JobId  Level  Files    Bytes  Status  Finished         Name
====================================================================
     4  Full       0        0  Cancel  17-Feb-20 16:30  marcelo-qosmio-backup-data
     5  Full       0        0  Cancel  17-Feb-20 16:30  marcelo-qosmio-backup-data
     6  Full       0        0  Cancel  17-Feb-20 16:30  marcelo-qosmio-backup-data
     1  Full       8  250.6 M  OK      17-Feb-20 16:30  marcelo-qosmio-backup-data
     7  Full       8  250.6 M  OK      17-Feb-20 16:31  marcelo-qosmio-backup-data
     8  Incr       0        0  OK      17-Feb-20 16:31  marcelo-qosmio-backup-data
    10  Incr       0        0  Cancel  17-Feb-20 16:34  marcelo-qosmio-backup-data
    11  Incr       0        0  Cancel  17-Feb-20 16:34  marcelo-qosmio-backup-data
     9  Incr       2  581.4 M  OK      17-Feb-20 16:35  marcelo-qosmio-backup-data
    12  Incr       2  5.581 M  OK      17-Feb-20 16:36  marcelo-qosmio-backup-data
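For context, the duplicate-handling part of the Job resource looks
roughly like this (a sketch; only Cancel Queued Duplicates = yes is
confirmed above, the other directives and values are assumptions):

```
Job {
  Name = "marcelo-qosmio-backup-data"
  Type = Backup
  Client = marcelo-qosmio-fd
  FileSet = "marcelo-test-fileset"
  Accurate = yes
  # duplicate handling: queued duplicates get canceled (status 'A')
  Allow Duplicate Jobs = no
  Cancel Queued Duplicates = yes
}
```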
Here is the log:
17-Feb 16:30 freenas-bareos-sd JobId 1: Releasing device
"file-device-cons-01" (/mnt/bareos-storage).
17-Feb 16:30 freenas-bareos-sd JobId 1: Elapsed time=00:00:22, Transfer
rate=11.39 M Bytes/second
17-Feb 16:30 freenas-bareos-sd JobId 1: Sending spooled attrs to the
Director. Despooling 2,363 bytes ...
17-Feb 16:30 freenas-bareos-dir JobId 1: Insert of attributes batch
table with 8 entries start
17-Feb 16:30 freenas-bareos-dir JobId 1: Insert of attributes batch
table done
17-Feb 16:30 freenas-bareos-dir JobId 1: Bareos freenas-bareos-dir
19.2.6 (11Feb20):
Build OS: FreeBSD-11.3-RELEASE-p5 freebsd 11.3-RELEASE-p5
JobId: 1
Job: marcelo-qosmio-backup-data.2020-02-17_16.30.00_03
Backup Level: Full (upgraded from Incremental)
Client: "marcelo-qosmio-fd" 19.2.6 (11Feb20)
Linux-3.10.0-1062.9.1.el7.x86_64,ubuntu,Ubuntu 18.04
LTS,xUbuntu_18.04,x86_64
FileSet: "marcelo-test-fileset" 2020-02-17 16:30:00
Pool: "marcelo-qosmio-data-cons-pool" (From Job
FullPool override)
Catalog: "freebird-catalog" (From Client resource)
Storage: "freenas-cons-storage" (From Pool resource)
Scheduled time: 17-Feb-2020 16:30:00
Start time: 17-Feb-2020 16:30:33
End time: 17-Feb-2020 16:30:57
Elapsed time: 24 secs
Priority: 20
FD Files Written: 8
SD Files Written: 8
FD Bytes Written: 250,609,708 (250.6 MB)
SD Bytes Written: 250,610,671 (250.6 MB)
Rate: 10442.1 KB/s
Software Compression: None
VSS: no
Encryption: no
Accurate: yes
Volume name(s): 2020-02-17_16:30:03_marcelo-qosmio-data-cons-pool_J:1_V:0
Volume Session Id: 1
Volume Session Time: 1581974996
Last Volume Bytes: 250,797,540 (250.7 MB)
Non-fatal FD errors: 0
SD Errors: 0
FD termination status: OK
SD termination status: OK
Bareos binary info: self-compiled: Get official binaries and
vendor support on bareos.com
Termination: Backup OK
17-Feb 16:30 freenas-bareos-dir JobId 1: Begin pruning Jobs older than 6
months .
17-Feb 16:30 freenas-bareos-dir JobId 1: No Jobs found to prune.
17-Feb 16:30 freenas-bareos-dir JobId 1: Begin pruning Files.
17-Feb 16:30 freenas-bareos-dir JobId 1: No Files found to prune.
17-Feb 16:30 freenas-bareos-dir JobId 1: End auto prune.
17-Feb 16:31 freenas-bareos-dir JobId 7: Prior failed job found in
catalog. Upgrading to Full.
17-Feb 16:31 freenas-bareos-dir JobId 7: Start Backup JobId 7,
Job=marcelo-qosmio-backup-data.2020-02-17_16.31.09_10
17-Feb 16:31 freenas-bareos-dir JobId 7: Connected Storage daemon at
sd.bareos.freebird.dynu.net:9103, encryption: TLS_CHACHA20_POLY1305_SHA256
17-Feb 16:31 freenas-bareos-dir JobId 7: Created new Volume
"2020-02-17_16:31:12_marcelo-qosmio-data-cons-pool_J:7_V:1" in catalog.
17-Feb 16:31 freenas-bareos-dir JobId 7: Using Device
"file-device-cons-01" to write.
17-Feb 16:31 freenas-bareos-dir JobId 7: Using Client Initiated
Connection (marcelo-qosmio-fd).
17-Feb 16:31 freenas-bareos-dir JobId 7: Handshake: Immediate TLS
17-Feb 16:31 marcelo-qosmio-fd JobId 7: Connected Storage daemon at
sd.bareos.freebird.dynu.net:9103, encryption: TLS_CHACHA20_POLY1305_SHA256
17-Feb 16:31 marcelo-qosmio-fd JobId 7: Extended attribute support is
enabled
17-Feb 16:31 marcelo-qosmio-fd JobId 7: ACL support is enabled
17-Feb 16:31 freenas-bareos-sd JobId 7: Labeled new Volume
"2020-02-17_16:31:12_marcelo-qosmio-data-cons-pool_J:7_V:1" on device
"file-device-cons-01" (/mnt/bareos-storage).
17-Feb 16:31 freenas-bareos-sd JobId 7: Wrote label to prelabeled Volume
"2020-02-17_16:31:12_marcelo-qosmio-data-cons-pool_J:7_V:1" on device
"file-device-cons-01" (/mnt/bareos-storage)
17-Feb 16:31 freenas-bareos-dir JobId 7: Max Volume jobs=1 exceeded.
Marking Volume
"2020-02-17_16:31:12_marcelo-qosmio-data-cons-pool_J:7_V:1" as Used.
17-Feb 16:31 freenas-bareos-sd JobId 7: Releasing device
"file-device-cons-01" (/mnt/bareos-storage).
17-Feb 16:31 freenas-bareos-sd JobId 7: Elapsed time=00:00:22, Transfer
rate=11.39 M Bytes/second
17-Feb 16:31 freenas-bareos-sd JobId 7: Sending spooled attrs to the
Director. Despooling 2,363 bytes ...
17-Feb 16:31 freenas-bareos-dir JobId 7: Insert of attributes batch
table with 8 entries start
17-Feb 16:31 freenas-bareos-dir JobId 7: Insert of attributes batch
table done
17-Feb 16:31 freenas-bareos-dir JobId 7: Bareos freenas-bareos-dir
19.2.6 (11Feb20):
Build OS: FreeBSD-11.3-RELEASE-p5 freebsd 11.3-RELEASE-p5
JobId: 7
Job: marcelo-qosmio-backup-data.2020-02-17_16.31.09_10
Backup Level: Full (upgraded from Full)
Client: "marcelo-qosmio-fd" 19.2.6 (11Feb20)
Linux-3.10.0-1062.9.1.el7.x86_64,ubuntu,Ubuntu 18.04
LTS,xUbuntu_18.04,x86_64
FileSet: "marcelo-test-fileset" 2020-02-17 16:30:00
Pool: "marcelo-qosmio-data-cons-pool" (From Job
FullPool override)
Catalog: "freebird-catalog" (From Client resource)
Storage: "freenas-cons-storage" (From Pool resource)
Scheduled time: 17-Feb-2020 16:31:07
Start time: 17-Feb-2020 16:31:12
End time: 17-Feb-2020 16:31:35
Elapsed time: 23 secs
Priority: 20
FD Files Written: 8
SD Files Written: 8
FD Bytes Written: 250,609,708 (250.6 MB)
SD Bytes Written: 250,610,671 (250.6 MB)
Rate: 10896.1 KB/s
Software Compression: None
VSS: no
Encryption: no
Accurate: yes
Volume name(s): 2020-02-17_16:31:12_marcelo-qosmio-data-cons-pool_J:7_V:1
Volume Session Id: 2
Volume Session Time: 1581974996
Last Volume Bytes: 250,797,540 (250.7 MB)
Non-fatal FD errors: 0
SD Errors: 0
FD termination status: OK
SD termination status: OK
Bareos binary info: self-compiled: Get official binaries and
vendor support on bareos.com
Termination: Backup OK
17-Feb 16:31 freenas-bareos-dir JobId 7: Begin pruning Jobs older than 6
months .
17-Feb 16:31 freenas-bareos-dir JobId 7: No Jobs found to prune.
17-Feb 16:31 freenas-bareos-dir JobId 7: Begin pruning Files.
17-Feb 16:31 freenas-bareos-dir JobId 7: No Files found to prune.
17-Feb 16:31 freenas-bareos-dir JobId 7: End auto prune.
** Update 2:
I decided to try the first AI job that caused problems for me by
running it manually on its own and waiting for it to complete. After
that, subsequent runs are indeed incremental, so this is a good
workaround.
I do hope this gets fixed at some point.
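In bconsole terms, the workaround amounts to something like this (a
plain wait blocks until all running jobs have terminated):

```
*run job=marcelo-qosmio-backup-data yes
*wait
*messages
```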
Thanks!
Marcelo