Re: [Bacula-users] 300k+ orphaned file and path records

2017-10-16 Thread Gary R. Schmidt

On 17/10/2017 02:01, Adam Weremczuk wrote:


Our Bacula db is now over 13GB in size and I feel at least half of it is 
junk.


That is a microscopic database in Bacula terms; mine stabilised at 
around 90GB with a retention of 12 months, and there are many Bacula 
systems with databases in the hundreds of gigabytes.


Just relax: if Bacula itself is backing up and restoring files, 
everything is working, and there is no need to start checking things 
unless a problem actually appears.


Cheers,
GaryB-)

--
Check out the vibrant tech community on one of the world's most
engaging tech sites, Slashdot.org! http://sdm.link/slashdot
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] 300k+ orphaned file and path records

2017-10-16 Thread Alan Brown

On 16/10/17 16:01, Adam Weremczuk wrote:

Hi Radoslaw,

Our Bacula db is now over 13GB in size and I feel at least half of it 
is junk.

Most likely it also suffers a performance penalty.



Switch to PostgreSQL ASAP. You will breathe a LOT easier once you've 
done that.


MySQL is good at what it does and OK for small Bacula installations, but 
it DOES NOT SCALE WELL.



More reason for keeping it clean: backups, upgrades, migration to a 
different server etc.


I don't know how to safely and efficiently perform this clean-up by 
hand.

That's why I'm asking experts :)



Use the dbcheck utility to clean up the database; that's what it's there 
for (make sure nothing else is running!)
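
For a supervised run, something along these lines is a reasonable sketch. The dbcheck flags and config path are taken from this thread; the service name and dump location are assumptions to adjust for your distribution:

```shell
# Sketch only: service name and dump path are assumptions.

# 1. Make sure nothing else touches the catalog while dbcheck runs.
service bacula-director stop

# 2. Take a full catalog dump first, so the whole run is reversible.
mysqldump --single-transaction bacula > /var/backups/bacula-catalog.sql

# 3. Run dbcheck in batch mode with fixes enabled ("-f -b"),
#    as already scheduled in this thread.
dbcheck -b -f -c /etc/bacula/bacula-dir.conf

# 4. Bring the Director back.
service bacula-director start
```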





Regards
Adam

On 16/10/2017 15:49, Radosław Korzeniewski wrote:

Hello,

2017-10-16 15:33 GMT+02:00 Adam Weremczuk:


Does it mean the only option is throwing this database into a bin?


Orphaned records (especially Path records) are not harmful to Bacula 
at all, and there is no requirement that they be removed. 
They only take up space in the database and can be reused in the future 
if the same path appears in a backup.


No workaround really?


You can remove them manually if you know how to do this.

best regards
--
Radosław Korzeniewski
rados...@korzeniewski.net 









Re: [Bacula-users] 300k+ orphaned file and path records

2017-10-16 Thread Adam Weremczuk

Hi Radoslaw,

Our Bacula db is now over 13GB in size and I feel at least half of it is 
junk.

Most likely it also suffers a performance penalty.

More reason for keeping it clean: backups, upgrades, migration to a 
different server etc.


I don't know how to safely and efficiently perform this clean-up by hand.
That's why I'm asking experts :)

Regards
Adam

On 16/10/2017 15:49, Radosław Korzeniewski wrote:

Hello,

2017-10-16 15:33 GMT+02:00 Adam Weremczuk:


Does it mean the only option is throwing this database into a bin?


Orphaned records (especially Path records) are not harmful to Bacula at 
all, and there is no requirement that they be removed. They 
only take up space in the database and can be reused in the future if the 
same path appears in a backup.


No workaround really?


You can remove them manually if you know how to do this.

best regards
--
Radosław Korzeniewski
rados...@korzeniewski.net 




Re: [Bacula-users] 300k+ orphaned file and path records

2017-10-16 Thread Radosław Korzeniewski
Hello,

2017-10-16 15:33 GMT+02:00 Adam Weremczuk:

> Does it mean the only option is throwing this database into a bin?
>

Orphaned records (especially Path records) are not harmful to Bacula at all,
and there is no requirement that they be removed. They only take up
space in the database and can be reused in the future if the same path comes
up in a backup.


> No workaround really?
>

You can remove them manually if you know how to do this.
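
For the record, a manual removal would mirror the query dbcheck itself issues (shown elsewhere in this thread), turned into MySQL multi-table DELETEs. This is only a sketch, not a recommendation; take a full dump and stop the Director before trying anything like it:

```sql
-- Sketch only: delete Path rows that no File row references,
-- using the same join dbcheck runs to find them (minus the LIMIT).
DELETE Path FROM Path
  LEFT JOIN File ON Path.PathId = File.PathId
 WHERE File.PathId IS NULL;

-- Likewise for orphaned Filename rows.
DELETE Filename FROM Filename
  LEFT JOIN File ON Filename.FilenameId = File.FilenameId
 WHERE File.FilenameId IS NULL;
```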

best regards
-- 
Radosław Korzeniewski
rados...@korzeniewski.net


Re: [Bacula-users] 300k+ orphaned file and path records

2017-10-16 Thread Adam Weremczuk

Hi Heitor,

Thank you for a quick reply.
We have plans to move Bacula to a dedicated box with a new tape drive 
and to install the latest version.

That's unfortunately unlikely to happen before Xmas.


And then did you kill the MySQL thread? Perhaps you should have waited.


I don't know exactly what happened then, but my guess is the process was 
abruptly killed.

This is just the story as documented by the previous sysadmin, whose setup I inherited.


In 2016 the service didn't stop working but an email report returned:

Checking for orphaned Path entries. This may take some time!
Query failed: SELECT DISTINCT Path.PathId,File.PathId FROM Path LEFT
OUTER JOIN File ON (Path.PathId=File.PathId) WHERE File.PathId IS
NULL LIMIT 30: ERR=Server shutdown in progress

I think the MySQL thread was killed while it was still processing, and 
then this error was printed.


Our MySQL is not that ancient; it's 5.5.31.
I think it reached the 300k limit and terminated gracefully, without 
corrupting the db.

It was an automated process running over Xmas when the office was empty.
Again - I wasn't present when it happened.


I think this is a dbcheck limit.


Does it mean the only option is throwing this database into a bin?
No workaround really?

Regards
Adam


Re: [Bacula-users] 300k+ orphaned file and path records

2017-10-16 Thread Heitor Faria
> Hello,

Hello, Adam, 

> I'm running Bacula 5.2.6 on Debian linux.
You are strongly advised to move to a more recent version. 

> A dry run of dbcheck has revealed the following.
> dbcheck -v -c /etc/bacula/bacula-dir.conf
> Hello, this is the database check/correct program.
> Modify database is off. Verbose is on.
> Please select the function you want to perform.

>  1) Toggle modify database flag
>  2) Toggle verbose flag
>  3) Check for bad Filename records
>  4) Check for bad Path records
>  5) Check for duplicate Filename records
>  6) Check for duplicate Path records
>  7) Check for orphaned Jobmedia records
>  8) Check for orphaned File records
>  9) Check for orphaned Path records
> 10) Check for orphaned Filename records
> 11) Check for orphaned FileSet records
> 12) Check for orphaned Client records
> 13) Check for orphaned Job records
> 14) Check for all Admin records
> 15) Check for all Restore records
> 16) All (3-15)
> 17) Quit
> Select function number: 16

> Checking for Filenames with a trailing slash
> Found 0 bad Filename records.
> Checking for Paths without a trailing slash
> Found 0 bad Path records.
> Checking for duplicate Filename entries.
> Found 0 duplicate Filename records.
> Checking for duplicate Path entries.
> Found 0 duplicate Path records.
> Checking for orphaned JobMedia entries.
> Checking for orphaned File entries. This may take some time!
> Note. Index over the PathId column not found, that can greatly slow down
> dbcheck.
> Create temporary index? (yes/no): yes
> Create temporary index... This may take some time!
> CREATE INDEX idxPIchk ON File (PathId)
> Temporary index created.
> Checking for orphaned Path entries. This may take some time!
> Found 30 orphaned Path records.
> Print them? (yes/no): no
> Drop temporary index.
> DROP INDEX idxPIchk ON File
> Temporary index idxPIchk deleted.
> Note. Index over the FilenameId column not found, that can greatly slow down
> dbcheck.
> Create temporary index? (yes/no): yes
> Create temporary index... This may take some time!
> CREATE INDEX idxFIchk ON File (FilenameId)
> Temporary index created.
> Checking for orphaned Filename entries. This may take some time!
> Found 30 orphaned Filename records.
> Print them? (yes/no): no
> Drop temporary index.
> DROP INDEX idxFIchk ON File
> Temporary index idxFIchk deleted.
> Checking for orphaned FileSet entries. This takes some time!
> Found 12 orphaned FileSet records.
> Print them? (yes/no): no
> Checking for orphaned Client entries.
> Found 0 orphaned Client records.
> Checking for orphaned Job entries.
> Found 31 orphaned Job records.
> Print them? (yes/no): no
> Checking for Admin Job entries.
> Found 0 Admin Job records.
> Checking for Restore Job entries.
> Found 23 Restore Job records.
> Print them? (yes/no): no

> A yearly job is scheduled to run "dbcheck -c /etc/bacula/bacula-dir.conf -f 
> -b"
> every December.

> This led to a db corruption in December 2015. "mysqladmin processlist" 
> revealed one INSERT job with a state "Waiting for table level lock",
> followed by a SELECT stuck in "executing".
> A series of CHECK TABLE and REPAIR TABLE commands made Bacula run again, 
> but the problem with the excessive number of records remained.

And then did you kill the MySQL thread? Perhaps you should have waited. 

> In 2016 the service didn't stop working but an email report returned:

> Checking for orphaned Path entries. This may take some time!
> Query failed: SELECT DISTINCT Path.PathId,File.PathId FROM Path LEFT
> OUTER JOIN File ON (Path.PathId=File.PathId) WHERE File.PathId IS
> NULL LIMIT 30: ERR=Server shutdown in progress

I think the MySQL thread was killed while it was still processing, and then 
this error was printed. 

> So I'm bracing myself for 2017.
> I'm planning to take a full db dump beforehand and run dbcheck manually under
> supervision, which I expect to fail again :(

> Any advice on how to resolve the 300k record issue?

I think this is a dbcheck limit. 

> Is it a good idea to increase this limit in the database?
> Or maybe a different (and fairly safe) SQL command that can fix it before
> dbcheck is executed?

> Thanks
> Adam

Regards. 
-- 
=== 
Heitor Medrado de Faria | CEO Bacula do Brasil | EB-1 Visa | LPIC-III | EMC 
05-001 | ITIL-F 
• Don't be charged by the size of your backups; check out Bacula Enterprise: 
http://www.bacula.com.br/enterprise/ 
• I deliver in-company Bacula Community training and deployment: 
http://www.bacula.com.br/in-company/ 
+55 61 98268-4220 | www.bacula.com.br 

We also recommend these complementary courses: 
• Basic shell and shell programming with Julio Neves. 
• Zabbix with Adail Host. 
===

[Bacula-users] 300k+ orphaned file and path records

2017-10-16 Thread Adam Weremczuk

Hello,

I'm running Bacula 5.2.6 on Debian Linux.

A dry run of dbcheck has revealed the following.

dbcheck -v -c /etc/bacula/bacula-dir.conf
Hello, this is the database check/correct program.
Modify database is off. Verbose is on.
Please select the function you want to perform.

 1) Toggle modify database flag
 2) Toggle verbose flag
 3) Check for bad Filename records
 4) Check for bad Path records
 5) Check for duplicate Filename records
 6) Check for duplicate Path records
 7) Check for orphaned Jobmedia records
 8) Check for orphaned File records
 9) Check for orphaned Path records
10) Check for orphaned Filename records
11) Check for orphaned FileSet records
12) Check for orphaned Client records
13) Check for orphaned Job records
14) Check for all Admin records
15) Check for all Restore records
16) All (3-15)
17) Quit
Select function number: 16
Checking for Filenames with a trailing slash
Found 0 bad Filename records.
Checking for Paths without a trailing slash
Found 0 bad Path records.
Checking for duplicate Filename entries.
Found 0 duplicate Filename records.
Checking for duplicate Path entries.
Found 0 duplicate Path records.
Checking for orphaned JobMedia entries.
Checking for orphaned File entries. This may take some time!
Note. Index over the PathId column not found, that can greatly slow down
dbcheck.
Create temporary index? (yes/no): yes
Create temporary index... This may take some time!
CREATE INDEX idxPIchk ON File (PathId)
Temporary index created.
Checking for orphaned Path entries. This may take some time!
Found 30 orphaned Path records.
Print them? (yes/no): no
Drop temporary index.
DROP INDEX idxPIchk ON File
Temporary index idxPIchk deleted.
Note. Index over the FilenameId column not found, that can greatly slow down
dbcheck.
Create temporary index? (yes/no): yes
Create temporary index... This may take some time!
CREATE INDEX idxFIchk ON File (FilenameId)
Temporary index created.
Checking for orphaned Filename entries. This may take some time!
Found 30 orphaned Filename records.
Print them? (yes/no): no
Drop temporary index.
DROP INDEX idxFIchk ON File
Temporary index idxFIchk deleted.
Checking for orphaned FileSet entries. This takes some time!
Found 12 orphaned FileSet records.
Print them? (yes/no): no
Checking for orphaned Client entries.
Found 0 orphaned Client records.
Checking for orphaned Job entries.
Found 31 orphaned Job records.
Print them? (yes/no): no
Checking for Admin Job entries.
Found 0 Admin Job records.
Checking for Restore Job entries.
Found 23 Restore Job records.
Print them? (yes/no): no

A yearly job is scheduled to run "dbcheck -c /etc/bacula/bacula-dir.conf -f -b" 
every December.

This led to a db corruption in December 2015.
"mysqladmin processlist" revealed one INSERT job with a state "Waiting for table level 
lock", followed by a SELECT stuck in "executing".
A series of CHECK TABLE and REPAIR TABLE commands made Bacula run again, but the 
problem with the excessive number of records remained.

In 2016 the service didn't stop working but an email report returned:

Checking for orphaned Path entries. This may take some time!
Query failed: SELECT DISTINCT Path.PathId,File.PathId FROM Path LEFT
OUTER JOIN File ON (Path.PathId=File.PathId) WHERE File.PathId IS
NULL LIMIT 30: ERR=Server shutdown in progress

So I'm bracing myself for 2017.
I'm planning to take a full db dump beforehand and run dbcheck manually under 
supervision, which I expect to fail again :(

Any advice on how to resolve the 300k record issue?
Is it a good idea to increase this limit in the database?
Or maybe a different (and fairly safe) SQL command that can fix it before 
dbcheck is executed?
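
(The kind of query I have in mind is something like dbcheck's own join from the error above, minus its LIMIT, e.g. just counting the orphans first:)

```sql
-- Count orphaned Path records up front, using dbcheck's join, no LIMIT.
SELECT COUNT(*)
  FROM Path
  LEFT OUTER JOIN File ON Path.PathId = File.PathId
 WHERE File.PathId IS NULL;
```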

Thanks
Adam
