Re: [gpfsug-discuss] mmbackup file selections

2022-02-07 Thread Alec
I'll share something we do when working with the GPFS policy engine so we
don't blow out our backups...

So we use a different backup solution and have our file system broken
down into multiple concurrent streams.

In my policy engine, when making major changes to the file system such as
encrypting or compressing data, I use a WHERE clause such as:
  MOD(INODE, 7) <= dayofweek

When we call mmapplypolicy I add -M dayofweek=NN.

In this case I'd use cron and pass day of the week.

What this achieves is that on each day I only work on 1/7th of each file
system, so that no one backup stream is blown out. It is cumulative, so a
value of 7 or more will cover 100% of the file system.

It's a nifty trick, so I figured I'd share it.

In production we use something more like MOD(INODE, 40), and set the counter
to increment by 1 on weekdays and 3 on weekends, to distribute the workload
over the whole month with more work done on the weekends.
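
For anyone who wants to try this, a minimal sketch of how the pieces could be
wired up from cron (the file system path, policy file name and rule wording
are illustrative placeholders, not Alec's actual setup):

#!/bin/bash
# Sketch only: run the policy over a rotating slice of the file system,
# driven by the day of the week.
set -euo pipefail

FS=/gpfs/fs1                    # placeholder file system
RULES=/root/policy/spread.pol   # placeholder policy file whose rule ends in:
                                #   ... WHERE MOD(INODE, 7) <= dayofweek
DOW=$(date +%u)                 # 1..7, Monday..Sunday

# -M substitutes the macro into the rule; -I test would give a dry run first.
mmapplypolicy "$FS" -P "$RULES" -M dayofweek="$DOW" -I yes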

Alec

On Mon, Feb 7, 2022, 8:39 AM Paul Ward  wrote:

> Backups seem to have settled down.
> A workshop with our partner and IBM is in the pipeline.
>
>
> Kindest regards,
> Paul
>
> Paul Ward
> TS Infrastructure Architect
> Natural History Museum
> T: 02079426450
> E: p.w...@nhm.ac.uk
>
>
> -Original Message-
> From: gpfsug-discuss-boun...@spectrumscale.org <
> gpfsug-discuss-boun...@spectrumscale.org> On Behalf Of Paul Ward
> Sent: 01 February 2022 12:28
> To: gpfsug main discussion list 
> Subject: Re: [gpfsug-discuss] mmbackup file selections
>
> Not currently set. I'll look into them.
>
>
> Kindest regards,
> Paul
>
> Paul Ward
> TS Infrastructure Architect
> Natural History Museum
> T: 02079426450
> E: p.w...@nhm.ac.uk
>
>
> -Original Message-
> From: gpfsug-discuss-boun...@spectrumscale.org <
> gpfsug-discuss-boun...@spectrumscale.org> On Behalf Of Skylar Thompson
> Sent: 26 January 2022 16:50
> To: gpfsug main discussion list 
> Subject: Re: [gpfsug-discuss] mmbackup file selections
>
> Awesome, glad that you found them (I missed them the first time too).
>
> As for the anomalous changed files, do you have these options set in your
> client option file?
>
> skipacl yes
> skipaclupdatecheck yes
> updatectime yes
>
> We had similar problems where metadata and ACL updates were interpreted as
> data changes by mmbackup/dsmc.
>
> We also have a case open with IBM where mmbackup will both expire and
> backup a file in the same run, even in the absence of mtime changes, but
> it's unclear whether that's program error or something with our
> include/exclude rules. I'd be curious if you're running into that as well.
>
> On Wed, Jan 26, 2022 at 03:55:48PM +, Paul Ward wrote:
> > Good call!
> >
> > Yes they are dot files.
> >
> >
> > New issue.
> >
> > mmbackup seems to be backing up the same files over and over without them
> > changing: areas are being backed up multiple times.
> > The example below is a co-resident file; the only thing that has changed
> > since it was created on 20/10/21 is that the file has been accessed for backup.
> > This file is in the 'changed' list in mmbackup:
> >
> > This list has just been created:
> > -rw-r--r--. 1 root root 6591914 Jan 26 11:12
> > mmbackupChanged.ix.197984.22A38AA7.39.nhmfsa
> >
> > Listing the last few files in the file (selecting the last one)
> > 11:17:52 [root@scale-sk-pn-1 .mmbackupCfg]# tail
> > mmbackupChanged.ix.197984.22A38AA7.39.nhmfsa
> >
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604556977.png"
> >
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557039.png"
> >
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557102.png"
> >
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557164.png"
> >
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557226.png"
> >
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557288.png"
> >
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557351.png"
> >
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557413.png"
> >
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban

Re: [gpfsug-discuss] mmbackup file selections

2022-02-07 Thread Paul Ward
Backups seem to have settled down.
A workshop with our partner and IBM is in the pipeline.


Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: p.w...@nhm.ac.uk


-Original Message-
From: gpfsug-discuss-boun...@spectrumscale.org 
 On Behalf Of Paul Ward
Sent: 01 February 2022 12:28
To: gpfsug main discussion list 
Subject: Re: [gpfsug-discuss] mmbackup file selections

Not currently set. I'll look into them.


Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: p.w...@nhm.ac.uk


-Original Message-
From: gpfsug-discuss-boun...@spectrumscale.org 
 On Behalf Of Skylar Thompson
Sent: 26 January 2022 16:50
To: gpfsug main discussion list 
Subject: Re: [gpfsug-discuss] mmbackup file selections

Awesome, glad that you found them (I missed them the first time too).

As for the anomalous changed files, do you have these options set in your 
client option file?

skipacl yes
skipaclupdatecheck yes
updatectime yes

We had similar problems where metadata and ACL updates were interpreted as data 
changes by mmbackup/dsmc.

We also have a case open with IBM where mmbackup will both expire and backup a 
file in the same run, even in the absence of mtime changes, but it's unclear 
whether that's program error or something with our include/exclude rules. I'd 
be curious if you're running into that as well.
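
(A minimal sketch of where those options could live; the path is the usual
Linux BA-client location and the exact file/stanza depends on the local dsmc
setup, so treat it as an assumption rather than anyone's real configuration.
Worth noting: as far as I know, skipacl also stops ACLs being backed up at
all, so weigh that trade-off first.)

# Illustrative only: append the options above to the client options file
# used by dsmc/mmbackup on the backup nodes.
cat >> /opt/tivoli/tsm/client/ba/bin/dsm.sys <<'EOF'
* Avoid treating ACL/metadata-only updates as data changes
skipacl            yes
skipaclupdatecheck yes
updatectime        yes
EOF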

On Wed, Jan 26, 2022 at 03:55:48PM +, Paul Ward wrote:
> Good call!
> 
> Yes they are dot files.
> 
> 
> New issue.
> 
> mmbackup seems to be backing up the same files over and over without them
> changing: areas are being backed up multiple times.
> The example below is a co-resident file; the only thing that has changed
> since it was created on 20/10/21 is that the file has been accessed for backup.
> This file is in the 'changed' list in mmbackup:
> 
> This list has just been created:
> -rw-r--r--. 1 root root 6591914 Jan 26 11:12 
> mmbackupChanged.ix.197984.22A38AA7.39.nhmfsa
> 
> Listing the last few files in the file (selecting the last one)
> 11:17:52 [root@scale-sk-pn-1 .mmbackupCfg]# tail 
> mmbackupChanged.ix.197984.22A38AA7.39.nhmfsa
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604556977.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557039.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557102.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557164.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557226.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557288.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557351.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557413.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557476.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png"
> 
> Check the file stats (access time just before last backup)
> 11:18:05 [root@scale-sk-pn-1 .mmbackupCfg]# stat 
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png"
>   File: 
> '/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png'
>   Size: 545 Blocks: 32 IO Block: 4194304 regular file
> Device: 2bh/43d Inode: 212618897   Links: 1
> Access: (0644/-rw-r--r--)  Uid: (1399613896/NHM\edwab)   Gid: 
> (1399647564/NHM\dg-mbl-urban-nature-project-rw)
> Context: unconfined_u:object_r:unlabeled_t:s0
> Access: 2022-01-25 06:40:58.334961446 +
> Modify: 2020-12-01 15:20:40.122053000 +
> Change: 2021-10-20 17:55:18.265746459 +0100
> Birth: -
> 
> Check if migrated
> 11:18:16 [root@scale-sk-pn-1 .mmbackupCfg]# dsmls 
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png"
> File name   : 
> /gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
> On-line size

Re: [gpfsug-discuss] mmbackup file selections

2022-02-01 Thread Paul Ward
Not currently set. I'll look into them.


Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: p.w...@nhm.ac.uk


-Original Message-
From: gpfsug-discuss-boun...@spectrumscale.org 
 On Behalf Of Skylar Thompson
Sent: 26 January 2022 16:50
To: gpfsug main discussion list 
Subject: Re: [gpfsug-discuss] mmbackup file selections

Awesome, glad that you found them (I missed them the first time too).

As for the anomalous changed files, do you have these options set in your 
client option file?

skipacl yes
skipaclupdatecheck yes
updatectime yes

We had similar problems where metadata and ACL updates were interpreted as data 
changes by mmbackup/dsmc.

We also have a case open with IBM where mmbackup will both expire and backup a 
file in the same run, even in the absence of mtime changes, but it's unclear 
whether that's program error or something with our include/exclude rules. I'd 
be curious if you're running into that as well.

On Wed, Jan 26, 2022 at 03:55:48PM +, Paul Ward wrote:
> Good call!
> 
> Yes they are dot files.
> 
> 
> New issue.
> 
> mmbackup seems to be backing up the same files over and over without them
> changing: areas are being backed up multiple times.
> The example below is a co-resident file; the only thing that has changed
> since it was created on 20/10/21 is that the file has been accessed for backup.
> This file is in the 'changed' list in mmbackup:
> 
> This list has just been created:
> -rw-r--r--. 1 root root 6591914 Jan 26 11:12 
> mmbackupChanged.ix.197984.22A38AA7.39.nhmfsa
> 
> Listing the last few files in the file (selecting the last one)
> 11:17:52 [root@scale-sk-pn-1 .mmbackupCfg]# tail 
> mmbackupChanged.ix.197984.22A38AA7.39.nhmfsa
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604556977.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557039.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557102.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557164.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557226.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557288.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557351.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557413.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557476.png"
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png"
> 
> Check the file stats (access time just before last backup)
> 11:18:05 [root@scale-sk-pn-1 .mmbackupCfg]# stat 
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png"
>   File: 
> '/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png'
>   Size: 545 Blocks: 32 IO Block: 4194304 regular file
> Device: 2bh/43d Inode: 212618897   Links: 1
> Access: (0644/-rw-r--r--)  Uid: (1399613896/NHM\edwab)   Gid: 
> (1399647564/NHM\dg-mbl-urban-nature-project-rw)
> Context: unconfined_u:object_r:unlabeled_t:s0
> Access: 2022-01-25 06:40:58.334961446 +
> Modify: 2020-12-01 15:20:40.122053000 +
> Change: 2021-10-20 17:55:18.265746459 +0100
> Birth: -
> 
> Check if migrated
> 11:18:16 [root@scale-sk-pn-1 .mmbackupCfg]# dsmls 
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png"
> File name   : 
> /gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
> On-line size: 545
> Used blocks : 16
> Data Version: 1
> Meta Version: 1
> State   : Co-resident
> Container Index : 1
> Base Name   : 
> 34C0B77D20194B0B.EACEB2055F6CAA58.56D56C5F140C8C9D..2197396D.0CAC4E91
> 
> Check if immutable
> 11:18:26 [root@scale-sk-pn-1 .mmbackupCfg]# mstat 
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/works

Re: [gpfsug-discuss] mmbackup file selections

2022-01-26 Thread Skylar Thompson
pool name:data
> fileset name: hpc-workspaces-fset
> snapshot name:
> creation time:Wed Oct 20 17:55:18 2021
> Misc attributes:  ARCHIVE
> Encrypted:no
> 
> Check active and inactive backups (it was backed up yesterday)
> 11:18:52 [root@scale-sk-pn-1 .mmbackupCfg]# dsmcqbi 
> "/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png"
> IBM Spectrum Protect
> Command Line Backup-Archive Client Interface
>   Client Version 8, Release 1, Level 10.0
>   Client date/time: 01/26/2022 11:19:02
> (c) Copyright by IBM Corporation and other(s) 1990, 2020. All Rights Reserved.
> 
> Node Name: SC-PN-SK-01
> Session established with server TSM-JERSEY: Windows
>   Server Version 8, Release 1, Level 10.100
>   Server date/time: 01/26/2022 11:19:02  Last access: 01/26/2022 11:07:05
> 
> Accessing as node: SCALE
>    Size     Backup Date           Mgmt Class   A/I   File
>    ----     -----------           ----------   ---   ----
>545  B  01/25/2022 06:41:17 DEFAULT  A  
> /gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
>545  B  12/28/2021 21:19:18 DEFAULT  I  
> /gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
>545  B  01/04/2022 06:17:35 DEFAULT  I  
> /gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
>545  B  01/04/2022 06:18:05 DEFAULT  I  
> /gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
> 
> 
> It will be backed up again shortly, why?
> 
> And it was backed up again:
> # dsmcqbi 
> /gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
> IBM Spectrum Protect
> Command Line Backup-Archive Client Interface
>   Client Version 8, Release 1, Level 10.0
>   Client date/time: 01/26/2022 15:54:09
> (c) Copyright by IBM Corporation and other(s) 1990, 2020. All Rights Reserved.
> 
> Node Name: SC-PN-SK-01
> Session established with server TSM-JERSEY: Windows
>   Server Version 8, Release 1, Level 10.100
>   Server date/time: 01/26/2022 15:54:10  Last access: 01/26/2022 15:30:03
> 
> Accessing as node: SCALE
>    Size     Backup Date           Mgmt Class   A/I   File
>    ----     -----------           ----------   ---   ----
>545  B  01/26/2022 12:23:02 DEFAULT  A  
> /gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
>545  B  12/28/2021 21:19:18 DEFAULT  I  
> /gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
>545  B  01/04/2022 06:17:35 DEFAULT  I  
> /gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
>545  B  01/04/2022 06:18:05 DEFAULT  I  
> /gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
>545  B  01/25/2022 06:41:17     DEFAULT      I  
> /gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
> 
> Kindest regards,
> Paul
> 
> Paul Ward
> TS Infrastructure Architect
> Natural History Museum
> T: 02079426450
> E: p.w...@nhm.ac.uk
> 
> 
> -Original Message-
> From: gpfsug-discuss-boun...@spectrumscale.org 
>  On Behalf Of Skylar Thompson
> Sent: 24 January 2022 15:37
> To: gpfsug main discussion list 
> Cc: gpfsug-discuss-boun...@spectrumscale.org
> Subject: Re: [gpfsug-discuss] mmbackup file selections
> 
> Hi Paul,
> 
> Did you look for dot files? At least for us on 5.0.5 there's a 
> .list.1. file while the backups are running:
> 
> /gpfs/grc6/.mmbackupCfg/updatedFiles/:
> -r 1 root nickers 6158526821 Jan 23 18:28 .list.1.gpfs-grc6
> /gpfs/grc6/.mmbackupCfg/expiredFiles/:
> -r 1 root nickers 85862211 Jan 23 18:28 .list.1.gpfs-grc6
> 
> On Mon, Jan 24, 2022 at 02:31:54PM +, Paul Ward wrote:
> > Those di

Re: [gpfsug-discuss] mmbackup file selections

2022-01-26 Thread Paul Ward
  --- 
   545  B  01/25/2022 06:41:17 DEFAULT  A  
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
   545  B  12/28/2021 21:19:18 DEFAULT  I  
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
   545  B  01/04/2022 06:17:35 DEFAULT  I  
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
   545  B  01/04/2022 06:18:05 DEFAULT  I  
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png


It will be backed up again shortly, why?

And it was backed up again:
# dsmcqbi 
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
IBM Spectrum Protect
Command Line Backup-Archive Client Interface
  Client Version 8, Release 1, Level 10.0
  Client date/time: 01/26/2022 15:54:09
(c) Copyright by IBM Corporation and other(s) 1990, 2020. All Rights Reserved.

Node Name: SC-PN-SK-01
Session established with server TSM-JERSEY: Windows
  Server Version 8, Release 1, Level 10.100
  Server date/time: 01/26/2022 15:54:10  Last access: 01/26/2022 15:30:03

Accessing as node: SCALE
   Size     Backup Date           Mgmt Class   A/I   File
   ----     -----------           ----------   ---   ----
   545  B  01/26/2022 12:23:02 DEFAULT  A  
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
   545  B  12/28/2021 21:19:18 DEFAULT  I  
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
   545  B  01/04/2022 06:17:35 DEFAULT  I  
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
   545  B  01/04/2022 06:18:05 DEFAULT  I  
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
   545  B  01/25/2022 06:41:17 DEFAULT  I  
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png

Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: p.w...@nhm.ac.uk


-Original Message-
From: gpfsug-discuss-boun...@spectrumscale.org 
 On Behalf Of Skylar Thompson
Sent: 24 January 2022 15:37
To: gpfsug main discussion list 
Cc: gpfsug-discuss-boun...@spectrumscale.org
Subject: Re: [gpfsug-discuss] mmbackup file selections

Hi Paul,

Did you look for dot files? At least for us on 5.0.5 there's a 
.list.1. file while the backups are running:

/gpfs/grc6/.mmbackupCfg/updatedFiles/:
-r 1 root nickers 6158526821 Jan 23 18:28 .list.1.gpfs-grc6
/gpfs/grc6/.mmbackupCfg/expiredFiles/:
-r 1 root nickers 85862211 Jan 23 18:28 .list.1.gpfs-grc6

On Mon, Jan 24, 2022 at 02:31:54PM +, Paul Ward wrote:
> Those directories are empty
> 
> 
> Kindest regards,
> Paul
> 
> Paul Ward
> TS Infrastructure Architect
> Natural History Museum
> T: 02079426450
> E: p.w...@nhm.ac.uk<mailto:p.w...@nhm.ac.uk>
> 
> From: gpfsug-discuss-boun...@spectrumscale.org 
>  On Behalf Of IBM Spectrum 
> Scale
> Sent: 22 January 2022 00:35
> To: gpfsug main discussion list 
> Cc: gpfsug-discuss-boun...@spectrumscale.org
> Subject: Re: [gpfsug-discuss] mmbackup file selections
> 
> 
> Hi Paul,
> 
> Instead of calculating *.ix.* files,  please look at a list file in these 
> directories.
> 
> updatedFiles  : contains a file that lists all candidates for backup
> statechFiles  : contains a file that lists all candidates for meta info update
> expiredFiles  : contains a file that lists all candidates for expiration
> 
> Regards, The Spectrum Scale (GPFS) team
> 
> --
> 
> 
> If your query concerns a potential software error in Spectrum Scale (GPFS) 
> and you have an IBM software maintenance contract please contact  
> 1-800-237-5511 in the United States or your local IBM Service Center in other 
> countries.
> 
> 

Re: [gpfsug-discuss] mmbackup file selections

2022-01-24 Thread Skylar Thompson
Hi Paul,

Did you look for dot files? At least for us on 5.0.5 there's a
.list.1. file while the backups are running:

/gpfs/grc6/.mmbackupCfg/updatedFiles/:
-r 1 root nickers 6158526821 Jan 23 18:28 .list.1.gpfs-grc6
/gpfs/grc6/.mmbackupCfg/expiredFiles/:
-r 1 root nickers 85862211 Jan 23 18:28 .list.1.gpfs-grc6

On Mon, Jan 24, 2022 at 02:31:54PM +, Paul Ward wrote:
> Those directories are empty
> 
> 
> Kindest regards,
> Paul
> 
> Paul Ward
> TS Infrastructure Architect
> Natural History Museum
> T: 02079426450
> E: p.w...@nhm.ac.uk<mailto:p.w...@nhm.ac.uk>
> 
> From: gpfsug-discuss-boun...@spectrumscale.org 
>  On Behalf Of IBM Spectrum Scale
> Sent: 22 January 2022 00:35
> To: gpfsug main discussion list 
> Cc: gpfsug-discuss-boun...@spectrumscale.org
> Subject: Re: [gpfsug-discuss] mmbackup file selections
> 
> 
> Hi Paul,
> 
> Instead of calculating *.ix.* files,  please look at a list file in these 
> directories.
> 
> updatedFiles  : contains a file that lists all candidates for backup
> statechFiles  : contains a file that lists all candidates for meta info update
> expiredFiles  : contains a file that lists all candidates for expiration
> 
> Regards, The Spectrum Scale (GPFS) team
> 
> --
> 
> If your query concerns a potential software error in Spectrum Scale (GPFS) 
> and you have an IBM software maintenance contract please contact  
> 1-800-237-5511 in the United States or your local IBM Service Center in other 
> countries.
> 
> 
> 
> From: "Paul Ward" mailto:p.w...@nhm.ac.uk>>
> To: "gpfsug main discussion list" 
> mailto:gpfsug-discuss@spectrumscale.org>>
> Cc: 
> "gpfsug-discuss-boun...@spectrumscale.org<mailto:gpfsug-discuss-boun...@spectrumscale.org>"
>  
> mailto:gpfsug-discuss-boun...@spectrumscale.org>>
> Date: 01/21/2022 09:38 AM
> Subject: [EXTERNAL] Re: [gpfsug-discuss] mmbackup file selections
> Sent by: 
> gpfsug-discuss-boun...@spectrumscale.org<mailto:gpfsug-discuss-boun...@spectrumscale.org>
> 
> 
> 
> 
> 
> Thank you
> 
> Right in the command line seems to have worked.
> At the end of the script I now copy the contents of the .mmbackupCfg folder 
> to a date stamped logging folder
> 
> Checking how many entries in these files compared to the Summary:
> wc -l mmbackup*
>   188 mmbackupChanged.ix.155513.6E9E8BE2.1.nhmfsa
>47 mmbackupChanged.ix.219901.8E89AB35.1.nhmfsa
>   188 mmbackupChanged.ix.37893.EDFB8FA7.1.nhmfsa
>40 mmbackupChanged.ix.81032.78717A00.1.nhmfsa
> 2 mmbackupExpired.ix.78683.2DD25239.1.nhmfsa
>   141 mmbackupStatech.ix.219901.8E89AB35.1.nhmfsa
>   148 mmbackupStatech.ix.81032.78717A00.1.nhmfsa
>   754 total
> From Summary
> Total number of objects inspected:  755
> I can live with a discrepancy of 1.
> 
> 2 mmbackupExpired.ix.78683.2DD25239.1.nhmfsa
> From Summary
> Total number of objects expired:2
> That matches
> 
> wc -l mmbackupC* mmbackupS*
>   188 mmbackupChanged.ix.155513.6E9E8BE2.1.nhmfsa
>47 mmbackupChanged.ix.219901.8E89AB35.1.nhmfsa
>   188 mmbackupChanged.ix.37893.EDFB8FA7.1.nhmfsa
>40 mmbackupChanged.ix.81032.78717A00.1.nhmfsa
>   141 mmbackupStatech.ix.219901.8E89AB35.1.nhmfsa
>   148 mmbackupStatech.ix.81032.78717A00.1.nhmfsa
>   752 total
> Summary:
> Total number of objects backed up:  751
> 
> A difference of 1 I can live with.
> 
> What does Statech stand for?
> 
> Just this to sort out:
> Total number of objects failed: 1
> I will add:
> --tsm-errorlog TSMErrorLogFile
> 
> 
> Kindest regards,
> Paul
> 
> Paul Ward
> TS Infrastructure Architect
> Natural History Museum
> T: 02079426450
> E: p.w...@nhm.ac.uk<mailto:p.w...@nhm.ac.uk>

Re: [gpfsug-discuss] mmbackup file selections

2022-01-24 Thread Paul Ward
Those directories are empty


Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: p.w...@nhm.ac.uk<mailto:p.w...@nhm.ac.uk>

From: gpfsug-discuss-boun...@spectrumscale.org 
 On Behalf Of IBM Spectrum Scale
Sent: 22 January 2022 00:35
To: gpfsug main discussion list 
Cc: gpfsug-discuss-boun...@spectrumscale.org
Subject: Re: [gpfsug-discuss] mmbackup file selections


Hi Paul,

Instead of calculating *.ix.* files,  please look at a list file in these 
directories.

updatedFiles  : contains a file that lists all candidates for backup
statechFiles  : contains a file that lists all candidates for meta info update
expiredFiles  : contains a file that lists all candidates for expiration

Regards, The Spectrum Scale (GPFS) team

--

If your query concerns a potential software error in Spectrum Scale (GPFS) and 
you have an IBM software maintenance contract please contact  1-800-237-5511 in 
the United States or your local IBM Service Center in other countries.



From: "Paul Ward" mailto:p.w...@nhm.ac.uk>>
To: "gpfsug main discussion list" 
mailto:gpfsug-discuss@spectrumscale.org>>
Cc: 
"gpfsug-discuss-boun...@spectrumscale.org<mailto:gpfsug-discuss-boun...@spectrumscale.org>"
 
mailto:gpfsug-discuss-boun...@spectrumscale.org>>
Date: 01/21/2022 09:38 AM
Subject: [EXTERNAL] Re: [gpfsug-discuss] mmbackup file selections
Sent by: 
gpfsug-discuss-boun...@spectrumscale.org<mailto:gpfsug-discuss-boun...@spectrumscale.org>





Thank you

Right in the command line seems to have worked.
At the end of the script I now copy the contents of the .mmbackupCfg folder to 
a date stamped logging folder

Checking how many entries in these files compared to the Summary:
wc -l mmbackup*
  188 mmbackupChanged.ix.155513.6E9E8BE2.1.nhmfsa
   47 mmbackupChanged.ix.219901.8E89AB35.1.nhmfsa
  188 mmbackupChanged.ix.37893.EDFB8FA7.1.nhmfsa
   40 mmbackupChanged.ix.81032.78717A00.1.nhmfsa
2 mmbackupExpired.ix.78683.2DD25239.1.nhmfsa
  141 mmbackupStatech.ix.219901.8E89AB35.1.nhmfsa
  148 mmbackupStatech.ix.81032.78717A00.1.nhmfsa
  754 total
From Summary
Total number of objects inspected:  755
I can live with a discrepancy of 1.

2 mmbackupExpired.ix.78683.2DD25239.1.nhmfsa
From Summary
Total number of objects expired:2
That matches

wc -l mmbackupC* mmbackupS*
  188 mmbackupChanged.ix.155513.6E9E8BE2.1.nhmfsa
   47 mmbackupChanged.ix.219901.8E89AB35.1.nhmfsa
  188 mmbackupChanged.ix.37893.EDFB8FA7.1.nhmfsa
   40 mmbackupChanged.ix.81032.78717A00.1.nhmfsa
  141 mmbackupStatech.ix.219901.8E89AB35.1.nhmfsa
  148 mmbackupStatech.ix.81032.78717A00.1.nhmfsa
  752 total
Summary:
Total number of objects backed up:  751

A difference of 1 I can live with.

What does Statech stand for?

Just this to sort out:
Total number of objects failed: 1
I will add:
--tsm-errorlog TSMErrorLogFile


Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: p.w...@nhm.ac.uk<mailto:p.w...@nhm.ac.uk>

From: 
gpfsug-discuss-boun...@spectrumscale.org<mailto:gpfsug-discuss-boun...@spectrumscale.org>
 
mailto:gpfsug-discuss-boun...@spectrumscale.org>>
 On Behalf Of IBM Spectrum Scale
Sent: 19 January 2022 15:09
To: gpfsug main discussion list 
mailto:gpfsug-discuss@spectrumscale.org>>
Cc: 
gpfsug-discuss-boun...@spectrumscale.org<mailto:gpfsug-discuss-boun...@spectrumscale.org>
Subject: Re: [gpfsug-discuss] mmbackup file selections


This is to set environment for mmbackup.
If mmbackup is invoked within a script, you can set "export DEBUGmmbackup=2" 
right above mmbackup command.
e.g)  in your script
   
export DEBUGmmbackup=2
 mmbackup 

Or, you can set it in the same command line like
DEBUGmmbackup=2 mmbackup 

Regards, The Spectrum Scale (GPFS) team

--
If your que

Re: [gpfsug-discuss] mmbackup file selections

2022-01-21 Thread IBM Spectrum Scale


Hi Paul,



Instead of calculating *.ix.* files,  please look at a list file in these
directories.


updatedFiles  : contains a file that lists all candidates for backup
statechFiles  : contains a file that lists all candidates for meta info update
expiredFiles  : contains a file that lists all candidates for expiration



Regards, The Spectrum Scale (GPFS) team

--


If your query concerns a potential software error in Spectrum Scale (GPFS)
and you have an IBM software maintenance contract please contact
1-800-237-5511 in the United States or your local IBM Service Center in
other countries.




From:   "Paul Ward" 
To: "gpfsug main discussion list"

Cc: "gpfsug-discuss-boun...@spectrumscale.org"

Date:   01/21/2022 09:38 AM
Subject:    [EXTERNAL] Re: [gpfsug-discuss] mmbackup file selections
Sent by:gpfsug-discuss-boun...@spectrumscale.org



Thank you

Right in the command line seems to have worked.
At the end of the script I now copy the contents of the .mmbackupCfg folder
to a date stamped logging folder

Checking how many entries in these files compared to the Summary:
wc -l mmbackup*
  188 mmbackupChanged.ix.155513.6E9E8BE2.1.nhmfsa
   47 mmbackupChanged.ix.219901.8E89AB35.1.nhmfsa
  188 mmbackupChanged.ix.37893.EDFB8FA7.1.nhmfsa
   40 mmbackupChanged.ix.81032.78717A00.1.nhmfsa
2 mmbackupExpired.ix.78683.2DD25239.1.nhmfsa
  141 mmbackupStatech.ix.219901.8E89AB35.1.nhmfsa
  148 mmbackupStatech.ix.81032.78717A00.1.nhmfsa
  754 total
From Summary
Total number of objects inspected:  755
I can live with a discrepancy of 1.

2 mmbackupExpired.ix.78683.2DD25239.1.nhmfsa
From Summary
Total number of objects expired:2
That matches

wc -l mmbackupC* mmbackupS*
  188 mmbackupChanged.ix.155513.6E9E8BE2.1.nhmfsa
   47 mmbackupChanged.ix.219901.8E89AB35.1.nhmfsa
  188 mmbackupChanged.ix.37893.EDFB8FA7.1.nhmfsa
   40 mmbackupChanged.ix.81032.78717A00.1.nhmfsa
  141 mmbackupStatech.ix.219901.8E89AB35.1.nhmfsa
  148 mmbackupStatech.ix.81032.78717A00.1.nhmfsa
  752 total
Summary:
Total number of objects backed up:  751

A difference of 1 I can live with.

What does Statech stand for?

Just this to sort out:
Total number of objects failed: 1
I will add:
--tsm-errorlog TSMErrorLogFile


Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: p.w...@nhm.ac.uk

From: gpfsug-discuss-boun...@spectrumscale.org
 On Behalf Of IBM Spectrum Scale
Sent: 19 January 2022 15:09
To: gpfsug main discussion list 
Cc: gpfsug-discuss-boun...@spectrumscale.org
Subject: Re: [gpfsug-discuss] mmbackup file selections



This is to set environment for mmbackup.
If mmbackup is invoked within a script, you can set "export
DEBUGmmbackup=2" right above mmbackup command.
e.g)  in your script

export DEBUGmmbackup=2
  mmbackup 

Or, you can set it in the same command line like
DEBUGmmbackup=2 mmbackup 

Regards, The Spectrum Scale (GPFS) team

--

If your query concerns a potential software error in Spectrum Scale (GPFS)
and you have an IBM software maintenance contract please contact
1-800-237-5511 in the United States or your local IBM Service Center in
other countries.


From: "Paul Ward" 
To: "gpfsug main discussion list" 
Cc: "gpfsug-discuss-boun...@spectrumscale.org" <
gpfsug-discuss-boun...@spectrumscale.org>
Date: 01/19/2022 06:04 AM
Subject: [EXTERNAL] Re: [gpfsug-discuss] mmbackup file selections
Sent by: gpfsug-discuss-boun...@spectrumscale.org





Re: [gpfsug-discuss] mmbackup file selections

2022-01-21 Thread Paul Ward
Got my vote.


Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: p.w...@nhm.ac.uk


-Original Message-
From: gpfsug-discuss-boun...@spectrumscale.org 
 On Behalf Of Skylar Thompson
Sent: 19 January 2022 15:40
To: gpfsug-discuss@spectrumscale.org
Subject: Re: [gpfsug-discuss] mmbackup file selections

Hi Paul,

Not to toot my own horn and while the DEBUGmmbackup=2 method definitely does 
work, you might want to vote for this RFE I put in a month ago to get a more 
robust "dry run" mode with mmbackup, since guessing how include/exclude rules 
get translated from SP/dsmc to SS/mmbackup can be
challenging:

https://www.ibm.com/developerworks/rfe/execute?use_case=viewRfe&CR_ID=153520

Somewhat selfishly, I think implementing the RFE would benefit you as well.
:)

On Tue, Jan 18, 2022 at 04:56:17PM +, Paul Ward wrote:
> Hi,
> 
> I am trying to work out what files have been sent to backup using mmbackup.
> I have increased the -L value from 3 up to 6 but only seem to see the files 
> that are in scope, not the ones that are selected.
> 
> I can see the three file lists generated during a backup, but can't seem to 
> find a list of what files were backed up.
> 
> It should be the diff of the shadow and shadow-old, but the wc -l of the diff 
> doesn't match the number of files in the backup summary.
> Wrong assumption?
> 
> Where should I be looking - surely it shouldn't be this hard to see what 
> files are selected?
> 
> 
> Kindest regards,
> Paul
> 
> Paul Ward
> TS Infrastructure Architect
> Natural History Museum
> T: 02079426450
> E: p.w...@nhm.ac.uk<mailto:p.w...@nhm.ac.uk>
> 



> ___
> gpfsug-discuss mailing list
> gpfsug-discuss at spectrumscale.org
> http://gpfsug.org/mailman/listinfo/gpfsug-discuss


--
-- Skylar Thompson (skyl...@u.washington.edu)
-- Genome Sciences Department (UW Medicine), System Administrator
-- Foege Building S046, (206)-685-7354
-- Pronouns: He/Him/His
___
gpfsug-discuss mailing list
gpfsug-discuss at spectrumscale.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss
___
gpfsug-discuss mailing list
gpfsug-discuss at spectrumscale.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss


Re: [gpfsug-discuss] mmbackup file selections

2022-01-21 Thread Paul Ward
Thank you

Right in the command line seems to have worked.
At the end of the script I now copy the contents of the .mmbackupCfg folder to 
a date stamped logging folder
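
(A minimal sketch of that copy step, assuming DEBUGmmbackup=2 is set so the
working files survive the run; the paths are placeholders, not the real
script:)

FSROOT=/gpfs/nhmfsa                            # placeholder record root
LOGDIR=/var/log/mmbackup/$(date +%Y%m%d-%H%M)  # date stamped logging folder
mkdir -p "$LOGDIR"
cp -a "$FSROOT/.mmbackupCfg/." "$LOGDIR/"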

Checking how many entries in these files compared to the Summary:
wc -l mmbackup*
  188 mmbackupChanged.ix.155513.6E9E8BE2.1.nhmfsa
   47 mmbackupChanged.ix.219901.8E89AB35.1.nhmfsa
  188 mmbackupChanged.ix.37893.EDFB8FA7.1.nhmfsa
   40 mmbackupChanged.ix.81032.78717A00.1.nhmfsa
2 mmbackupExpired.ix.78683.2DD25239.1.nhmfsa
  141 mmbackupStatech.ix.219901.8E89AB35.1.nhmfsa
  148 mmbackupStatech.ix.81032.78717A00.1.nhmfsa
  754 total
From Summary
Total number of objects inspected:  755
I can live with a discrepancy of 1.

2 mmbackupExpired.ix.78683.2DD25239.1.nhmfsa
From Summary
Total number of objects expired:2
That matches

wc -l mmbackupC* mmbackupS*
  188 mmbackupChanged.ix.155513.6E9E8BE2.1.nhmfsa
   47 mmbackupChanged.ix.219901.8E89AB35.1.nhmfsa
  188 mmbackupChanged.ix.37893.EDFB8FA7.1.nhmfsa
   40 mmbackupChanged.ix.81032.78717A00.1.nhmfsa
  141 mmbackupStatech.ix.219901.8E89AB35.1.nhmfsa
  148 mmbackupStatech.ix.81032.78717A00.1.nhmfsa
  752 total
Summary:
Total number of objects backed up:  751

A difference of 1 I can live with.

What does Statech stand for?

Just this to sort out:
Total number of objects failed: 1
I will add:
--tsm-errorlog TSMErrorLogFile


Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: p.w...@nhm.ac.uk<mailto:p.w...@nhm.ac.uk>

From: gpfsug-discuss-boun...@spectrumscale.org 
 On Behalf Of IBM Spectrum Scale
Sent: 19 January 2022 15:09
To: gpfsug main discussion list 
Cc: gpfsug-discuss-boun...@spectrumscale.org
Subject: Re: [gpfsug-discuss] mmbackup file selections


This is to set environment for mmbackup.
If mmbackup is invoked within a script, you can set "export DEBUGmmbackup=2" 
right above mmbackup command.
e.g)  in your script

export DEBUGmmbackup=2
  mmbackup 

Or, you can set it in the same command line like
DEBUGmmbackup=2 mmbackup 

Regards, The Spectrum Scale (GPFS) team

--
If your query concerns a potential software error in Spectrum Scale (GPFS) and 
you have an IBM software maintenance contract please contact  1-800-237-5511 in 
the United States or your local IBM Service Center in other countries.


From: "Paul Ward" mailto:p.w...@nhm.ac.uk>>
To: "gpfsug main discussion list" 
mailto:gpfsug-discuss@spectrumscale.org>>
Cc: 
"gpfsug-discuss-boun...@spectrumscale.org<mailto:gpfsug-discuss-boun...@spectrumscale.org>"
 
mailto:gpfsug-discuss-boun...@spectrumscale.org>>
Date: 01/19/2022 06:04 AM
Subject: [EXTERNAL] Re: [gpfsug-discuss] mmbackup file selections
Sent by: 
gpfsug-discuss-boun...@spectrumscale.org<mailto:gpfsug-discuss-boun...@spectrumscale.org>





Thank you.

We run a script on all our nodes that checks to see if they are the cluster 
manager.
If they are, then they take responsibility to start the backup script.
The script then randomly selects one of the available backup nodes and uses 
dsmsh mmbackup on it.

Where does this command belong?
I have seen it listed as an export command, again where should that be run - on
all backup nodes, or all nodes?


Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: p.w...@nhm.ac.uk<mailto:p.w...@nhm.ac.uk>

From: 
gpfsug-discuss-boun...@spectrumscale.org<mailto:gpfsug-discuss-boun...@spectrumscale.org>
 
mailto:gpfsug-discuss-boun...@spectrumscale.org>>
 On Behalf Of IBM Spectrum Scale
Sent: 18 January 2022 22:54
To: gpfsug main discussion list 
mailto:gpfsug-discuss@spectrumscale.org>>
Cc: 
gpfsug-discuss-boun...@spectrumscale.org<mailto:gpfsug-discuss-boun...@spectrumscale.org>
Subject: Re: [gpfsug-discuss] mmbackup file selections


Hi Paul,

If you run mmbackup with "DEBUGmmbackup=2&

Re: [gpfsug-discuss] mmbackup file selections

2022-01-19 Thread Skylar Thompson
Hi Paul,

Not to toot my own horn and while the DEBUGmmbackup=2 method definitely
does work, you might want to vote for this RFE I put in a month ago to get
a more robust "dry run" mode with mmbackup, since guessing how
include/exclude rules get translated from SP/dsmc to SS/mmbackup can be
challenging:

https://www.ibm.com/developerworks/rfe/execute?use_case=viewRfe&CR_ID=153520

Somewhat selfishly, I think implementing the RFE would benefit you as well.
:)

On Tue, Jan 18, 2022 at 04:56:17PM +, Paul Ward wrote:
> Hi,
> 
> I am trying to work out what files have been sent to backup using mmbackup.
> I have increased the -L value from 3 up to 6 but only seem to see the files 
> that are in scope, not the ones that are selected.
> 
> I can see the three file lists generated during a backup, but can't seem to 
> find a list of what files were backed up.
> 
> It should be the diff of the shadow and shadow-old, but the wc -l of the diff 
> doesn't match the number of files in the backup summary.
> Wrong assumption?
> 
> Where should I be looking - surely it shouldn't be this hard to see what 
> files are selected?
> 
> 
> Kindest regards,
> Paul
> 
> Paul Ward
> TS Infrastructure Architect
> Natural History Museum
> T: 02079426450
> E: p.w...@nhm.ac.uk
> 



> ___
> gpfsug-discuss mailing list
> gpfsug-discuss at spectrumscale.org
> http://gpfsug.org/mailman/listinfo/gpfsug-discuss


-- 
-- Skylar Thompson (skyl...@u.washington.edu)
-- Genome Sciences Department (UW Medicine), System Administrator
-- Foege Building S046, (206)-685-7354
-- Pronouns: He/Him/His
___
gpfsug-discuss mailing list
gpfsug-discuss at spectrumscale.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss


Re: [gpfsug-discuss] mmbackup file selections

2022-01-19 Thread IBM Spectrum Scale


This is to set environment for mmbackup.
If mmbackup is invoked within a script, you can set "export DEBUGmmbackup=2
" right above mmbackup command.
e.g)  in your script

export DEBUGmmbackup=2
  mmbackup 

Or, you can set it in the same command line like
DEBUGmmbackup=2 mmbackup 
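
(Both forms side by side in a wrapper script, with placeholder mmbackup
arguments since the original command line is elided above:)

# Sketch only: export once for everything the script runs...
export DEBUGmmbackup=2
mmbackup /gpfs/fs1 -t incremental     # placeholder arguments

# ...or scope it to a single invocation:
DEBUGmmbackup=2 mmbackup /gpfs/fs1 -t incremental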

Regards, The Spectrum Scale (GPFS) team

--

If your query concerns a potential software error in Spectrum Scale (GPFS)
and you have an IBM software maintenance contract please contact
1-800-237-5511 in the United States or your local IBM Service Center in
other countries.



From:   "Paul Ward" 
To: "gpfsug main discussion list"

Cc: "gpfsug-discuss-boun...@spectrumscale.org"

Date:   01/19/2022 06:04 AM
Subject:[EXTERNAL] Re: [gpfsug-discuss] mmbackup file selections
Sent by:gpfsug-discuss-boun...@spectrumscale.org



Thank you.

We run a script on all our nodes that checks to see if they are the cluster
manager.
If they are, then they take responsibility to start the backup script.
The script then randomly selects one of the available backup nodes and uses
dsmsh mmbackup on it.

Where does this command belong?
I have seen it listed as an export command, again where should that be run –
on all backup nodes, or all nodes?


Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: p.w...@nhm.ac.uk

From: gpfsug-discuss-boun...@spectrumscale.org
 On Behalf Of IBM Spectrum Scale
Sent: 18 January 2022 22:54
To: gpfsug main discussion list 
Cc: gpfsug-discuss-boun...@spectrumscale.org
Subject: Re: [gpfsug-discuss] mmbackup file selections



Hi Paul,

If you run mmbackup with "DEBUGmmbackup=2", it keeps all working files even
after successful backup. They are available at MMBACKUP_RECORD_ROOT
(default is FSroot or FilesetRoot directory).
In .mmbackupCfg directory, there are 3 directories:
updatedFiles  : contains a file that lists all candidates for backup
statechFiles  : contains a file that lists all candidates for meta info update
expiredFiles  : contains a file that lists all candidates for expiration


Regards, The Spectrum Scale (GPFS) team

--

If your query concerns a potential software error in Spectrum Scale (GPFS)
and you have an IBM software maintenance contract please contact
1-800-237-5511 in the United States or your local IBM Service Center in
other countries.


From: "Paul Ward" 
To: "gpfsug-discuss@spectrumscale.org" 
Date: 01/18/2022 11:56 AM
Subject: [EXTERNAL] [gpfsug-discuss] mmbackup file selections
Sent by: gpfsug-discuss-boun...@spectrumscale.org




Hi,

I am trying to work out what files have been sent to backup using mmbackup.
I have increased the -L value from 3 up to 6 but only seem to see the files
that are in scope, not the ones that are selected.

I can see the three file lists generated during a backup, but can’t seem to
find a list of what files were backed up.

It should be the diff of the shadow and shadow-old, but the wc -l of the
diff doesn’t match the number of files in the backup summary.
Wrong assumption?

Where should I be looking – surely it shouldn’t be this hard to see what
files are selected?


Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: p.w...@nhm.ac.uk
 ___
gpfsug-discuss mailing list
gpfsug-discuss at spectrumscale.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss


__

Re: [gpfsug-discuss] mmbackup file selections

2022-01-19 Thread Paul Ward
Thank you.

We run a script on all our nodes that checks to see if they are the cluster 
manager.
If they are, then they take responsibility to start the backup script.
The script then randomly selects one of the available backup nodes and uses 
dsmsh mmbackup on it.
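
(A minimal sketch of that selection logic, with placeholder node names; the
mmlsmgr parsing and the dsmsh wrapper call are assumptions based on the
description above, not the actual script:)

#!/bin/bash
# Sketch only: proceed only on the current cluster manager, then hand the
# backup to a randomly chosen backup node.
set -euo pipefail

# mmlsmgr -c reports the cluster manager; exact parsing depends on release.
mmlsmgr -c | grep -qw "$(hostname -s)" || exit 0

BACKUP_NODES=(backup01 backup02 backup03)             # placeholder list
TARGET=${BACKUP_NODES[RANDOM % ${#BACKUP_NODES[@]}]}

ssh "$TARGET" "dsmsh mmbackup /gpfs/nhmfsa"           # per the description above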

Where does this command belong?
I have seen it listed as an export command, again where should that be run - on
all backup nodes, or all nodes?


Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: p.w...@nhm.ac.uk<mailto:p.w...@nhm.ac.uk>

From: gpfsug-discuss-boun...@spectrumscale.org 
 On Behalf Of IBM Spectrum Scale
Sent: 18 January 2022 22:54
To: gpfsug main discussion list 
Cc: gpfsug-discuss-boun...@spectrumscale.org
Subject: Re: [gpfsug-discuss] mmbackup file selections


Hi Paul,

If you run mmbackup with "DEBUGmmbackup=2", it keeps all working files even 
after successful backup. They are available at MMBACKUP_RECORD_ROOT (default is 
FSroot or FilesetRoot directory).
In .mmbackupCfg directory, there are 3 directories:
updatedFiles  : contains a file that lists all candidates for backup
statechFiles  : contains a file that lists all candidates for meta info update
expiredFiles  : contains a file that lists all candidates for expiration


Regards, The Spectrum Scale (GPFS) team

--
If your query concerns a potential software error in Spectrum Scale (GPFS) and 
you have an IBM software maintenance contract please contact  1-800-237-5511 in 
the United States or your local IBM Service Center in other countries.


From: "Paul Ward" mailto:p.w...@nhm.ac.uk>>
To: "gpfsug-discuss@spectrumscale.org<mailto:gpfsug-discuss@spectrumscale.org>" 
mailto:gpfsug-discuss@spectrumscale.org>>
Date: 01/18/2022 11:56 AM
Subject: [EXTERNAL] [gpfsug-discuss] mmbackup file selections
Sent by: 
gpfsug-discuss-boun...@spectrumscale.org<mailto:gpfsug-discuss-boun...@spectrumscale.org>





Hi,

I am trying to work out what files have been sent to backup using mmbackup.
I have increased the -L value from 3 up to 6 but only seem to see the files 
that are in scope, not the ones that are selected.

I can see the three file lists generated during a backup, but can't seem to 
find a list of what files were backed up.

It should be the diff of the shadow and shadow-old, but the wc -l of the diff 
doesn't match the number of files in the backup summary.
Wrong assumption?

Where should I be looking - surely it shouldn't be this hard to see what files 
are selected?


Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: p.w...@nhm.ac.uk<mailto:p.w...@nhm.ac.uk>
 ___
gpfsug-discuss mailing list
gpfsug-discuss at spectrumscale.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss



___
gpfsug-discuss mailing list
gpfsug-discuss at spectrumscale.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss


Re: [gpfsug-discuss] mmbackup file selections

2022-01-18 Thread IBM Spectrum Scale


Hi Paul,

If you run mmbackup with "DEBUGmmbackup=2", it keeps all working files even
after successful backup. They are available at MMBACKUP_RECORD_ROOT
(default is FSroot or FilesetRoot directory).
In .mmbackupCfg directory, there are 3 directories:
updatedFiles  : contains a file that lists all candidates for backup
statechFiles  : contains a file that lists all candidates for meta info update
expiredFiles  : contains a file that lists all candidates for expiration
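
(A short sketch of pulling those candidate lists out after a DEBUGmmbackup=2
run; the file system root is a placeholder and the .list.* naming follows the
examples elsewhere in this thread:)

FSROOT=/gpfs/fs1                                # placeholder record root
for d in updatedFiles statechFiles expiredFiles; do
    echo "== $d =="
    # each directory holds a list file (e.g. .list.1.<node>); count its entries
    find "$FSROOT/.mmbackupCfg/$d" -type f -name '.list.*' -exec wc -l {} +
done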


Regards, The Spectrum Scale (GPFS) team

--

If your query concerns a potential software error in Spectrum Scale (GPFS)
and you have an IBM software maintenance contract please contact
1-800-237-5511 in the United States or your local IBM Service Center in
other countries.



From:   "Paul Ward" 
To: "gpfsug-discuss@spectrumscale.org"

Date:   01/18/2022 11:56 AM
Subject:    [EXTERNAL] [gpfsug-discuss] mmbackup file selections
Sent by:gpfsug-discuss-boun...@spectrumscale.org



Hi,

I am trying to work out what files have been sent to backup using mmbackup.
I have increased the -L value from 3 up to 6 but only seem to see the files
that are in scope, not the ones that are selected.

I can see the three file lists generated during a backup, but can’t seem to
find a list of what files were backed up.

It should be the diff of the shadow and shadow-old, but the wc -l of the
diff doesn’t match the number of files in the backup summary.
Wrong assumption?

Where should I be looking – surely it shouldn’t be this hard to see what
files are selected?


Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: p.w...@nhm.ac.uk
 ___
gpfsug-discuss mailing list
gpfsug-discuss at spectrumscale.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss



___
gpfsug-discuss mailing list
gpfsug-discuss at spectrumscale.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss


[gpfsug-discuss] mmbackup file selections

2022-01-18 Thread Paul Ward
Hi,

I am trying to work out what files have been sent to backup using mmbackup.
I have increased the -L value from 3 up to 6 but only seem to see the files 
that are in scope, not the ones that are selected.

I can see the three file lists generated during a backup, but can't seem to 
find a list of what files were backed up.

It should be the diff of the shadow and shadow-old, but the wc -l of the diff 
doesn't match the number of files in the backup summary.
Wrong assumption?
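
(For anyone comparing shadow databases by hand, a minimal sketch; the shadow
file names are placeholders, since the real names depend on the mmbackup
configuration:)

cd /gpfs/nhmfsa                 # placeholder fileset root
OLD=shadow.old                  # placeholder: previous shadow database copy
NEW=shadow.new                  # placeholder: current shadow database
# entries only in the new shadow roughly correspond to files sent this run
comm -13 <(sort "$OLD") <(sort "$NEW") | wc -l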

Where should I be looking - surely it shouldn't be this hard to see what files 
are selected?


Kindest regards,
Paul

Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: p.w...@nhm.ac.uk

___
gpfsug-discuss mailing list
gpfsug-discuss at spectrumscale.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss