Good call!
Yes, they are dot files.
New issue.
Mmbackup seems to be backing up the same files over and over without them
changing: areas are being backed up multiple times.
The example below is a co-resident file; the only thing that has changed since
it was created on 20/10/21 is that the file has been accessed for backup.
This file is in the 'changed' list in mmbackup:
This list has just been created:
-rw-r--r--. 1 root root 6591914 Jan 26 11:12
mmbackupChanged.ix.197984.22A38AA7.39.nhmfsa
Listing the last few entries in the file (and selecting the last one):
11:17:52 [root@scale-sk-pn-1 .mmbackupCfg]# tail
mmbackupChanged.ix.197984.22A38AA7.39.nhmfsa
"/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604556977.png"
"/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557039.png"
"/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557102.png"
"/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557164.png"
"/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557226.png"
"/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557288.png"
"/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557351.png"
"/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557413.png"
"/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557476.png"
"/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png"
Check the file stats (access time is just before the last backup):
11:18:05 [root@scale-sk-pn-1 .mmbackupCfg]# stat
"/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png"
File:
'/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png'
Size: 545 Blocks: 32 IO Block: 4194304 regular file
Device: 2bh/43d Inode: 212618897 Links: 1
Access: (0644/-rw-r--r--) Uid: (1399613896/NHM\edwab) Gid:
(1399647564/NHM\dg-mbl-urban-nature-project-rw)
Context: unconfined_u:object_r:unlabeled_t:s0
Access: 2022-01-25 06:40:58.334961446 +0000
Modify: 2020-12-01 15:20:40.122053000 +0000
Change: 2021-10-20 17:55:18.265746459 +0100
Birth: -
Check if migrated:
11:18:16 [root@scale-sk-pn-1 .mmbackupCfg]# dsmls
"/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png"
File name :
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
On-line size : 545
Used blocks : 16
Data Version : 1
Meta Version : 1
State : Co-resident
Container Index : 1
Base Name :
34C0B77D20194B0B.EACEB2055F6CAA58.56D56C5F140C8C9D.0000000000000000.2197396D.000000000CAC4E91
Check if immutable:
11:18:26 [root@scale-sk-pn-1 .mmbackupCfg]# mstat
"/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png"
file name:
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
metadata replication: 2 max 2
data replication: 2 max 2
immutable: no
appendOnly: no
flags:
storage pool name: data
fileset name: hpc-workspaces-fset
snapshot name:
creation time: Wed Oct 20 17:55:18 2021
Misc attributes: ARCHIVE
Encrypted: no
Check active and inactive backups (it was backed up yesterday):
11:18:52 [root@scale-sk-pn-1 .mmbackupCfg]# dsmcqbi
"/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png"
IBM Spectrum Protect
Command Line Backup-Archive Client Interface
Client Version 8, Release 1, Level 10.0
Client date/time: 01/26/2022 11:19:02
(c) Copyright by IBM Corporation and other(s) 1990, 2020. All Rights Reserved.
Node Name: SC-PN-SK-01
Session established with server TSM-JERSEY: Windows
Server Version 8, Release 1, Level 10.100
Server date/time: 01/26/2022 11:19:02 Last access: 01/26/2022 11:07:05
Accessing as node: SCALE
Size Backup Date Mgmt Class A/I File
---- ----------- ---------- --- ----
545 B 01/25/2022 06:41:17 DEFAULT A
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
545 B 12/28/2021 21:19:18 DEFAULT I
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
545 B 01/04/2022 06:17:35 DEFAULT I
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
545 B 01/04/2022 06:18:05 DEFAULT I
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
It will be backed up again shortly. Why?
And it was backed up again:
# dsmcqbi
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
IBM Spectrum Protect
Command Line Backup-Archive Client Interface
Client Version 8, Release 1, Level 10.0
Client date/time: 01/26/2022 15:54:09
(c) Copyright by IBM Corporation and other(s) 1990, 2020. All Rights Reserved.
Node Name: SC-PN-SK-01
Session established with server TSM-JERSEY: Windows
Server Version 8, Release 1, Level 10.100
Server date/time: 01/26/2022 15:54:10 Last access: 01/26/2022 15:30:03
Accessing as node: SCALE
Size Backup Date Mgmt Class A/I File
---- ----------- ---------- --- ----
545 B 01/26/2022 12:23:02 DEFAULT A
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
545 B 12/28/2021 21:19:18 DEFAULT I
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
545 B 01/04/2022 06:17:35 DEFAULT I
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
545 B 01/04/2022 06:18:05 DEFAULT I
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
545 B 01/25/2022 06:41:17 DEFAULT I
/gpfs/nhmfsa/bulk/share/data/mbl/share/workspaces/groups/urban-nature-project/audiowaveform/300_40/unp-grounds-01-1604557538.png
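
As a rough way to quantify how many of the entries in a changed list haven't
actually been modified, something along these lines should work (the cutoff
time and list file name below are just examples; adjust them to the time of
the previous backup and the list being checked):

# count entries in a changed list whose mtime and ctime both pre-date a cutoff
CUTOFF=$(date -d '2022-01-25 06:41' +%s)      # example: time of the previous backup
LIST=mmbackupChanged.ix.197984.22A38AA7.39.nhmfsa
unchanged=0; total=0
while IFS= read -r quoted; do
    f=${quoted%\"}; f=${f#\"}                 # strip the surrounding double quotes
    [ -e "$f" ] || continue
    total=$((total+1))
    mtime=$(stat -c %Y "$f"); ctime=$(stat -c %Z "$f")
    if [ "$mtime" -lt "$CUTOFF" ] && [ "$ctime" -lt "$CUTOFF" ]; then
        unchanged=$((unchanged+1))
    fi
done < "$LIST"
echo "$unchanged of $total entries look unchanged since the cutoff"
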
Kindest regards,
Paul
Paul Ward
TS Infrastructure Architect
Natural History Museum
T: 02079426450
E: [email protected]
-----Original Message-----
From: [email protected] On Behalf Of Skylar Thompson
Sent: 24 January 2022 15:37
To: gpfsug main discussion list <[email protected]>
Cc: [email protected]
Subject: Re: [gpfsug-discuss] mmbackup file selections
Hi Paul,
Did you look for dot files? At least for us on 5.0.5 there's a
.list.1.<tsm-node> file while the backups are running:
/gpfs/grc6/.mmbackupCfg/updatedFiles/:
-r-------- 1 root nickers 6158526821 Jan 23 18:28 .list.1.gpfs-grc6
/gpfs/grc6/.mmbackupCfg/expiredFiles/:
-r-------- 1 root nickers 85862211 Jan 23 18:28 .list.1.gpfs-grc6
On Mon, Jan 24, 2022 at 02:31:54PM +0000, Paul Ward wrote:
> Those directories are empty
>
>
> Kindest regards,
> Paul
>
> Paul Ward
> TS Infrastructure Architect
> Natural History Museum
> T: 02079426450
> E: [email protected]
>
> From: [email protected] On Behalf Of IBM Spectrum Scale
> Sent: 22 January 2022 00:35
> To: gpfsug main discussion list <[email protected]>
> Cc: [email protected]
> Subject: Re: [gpfsug-discuss] mmbackup file selections
>
>
> Hi Paul,
>
> Instead of counting the *.ix.* files, please look at the list file in these
> directories:
>
> updatedFiles : contains a file that lists all candidates for backup
> statechFiles : contains a file that lists all candidates for meta info update
> expiredFiles : contains a file that lists all candidates for expiration
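>
> For example, after a run with DEBUGmmbackup=2 set, something along these lines
> should show how many candidates are in each list and what they look like (the
> root path below is a placeholder):
>
> cd /gpfs/<fs-or-fileset-root>/.mmbackupCfg
> for d in updatedFiles statechFiles expiredFiles; do
>     echo "== $d =="
>     wc -l "$d"/* 2>/dev/null     # number of candidates in each list
>     head -3 "$d"/* 2>/dev/null   # first few entries as a sanity check
> done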
>
> Regards, The Spectrum Scale (GPFS) team
>
> ----------------------------------------------------------------------
>
> If your query concerns a potential software error in Spectrum Scale (GPFS)
> and you have an IBM software maintenance contract please contact
> 1-800-237-5511 in the United States or your local IBM Service Center in other
> countries.
>
>
>
> From: "Paul Ward" <[email protected]<mailto:[email protected]>>
> To: "gpfsug main discussion list"
> <[email protected]<mailto:gpfsug-discuss@spectrumscale.
> org>>
> Cc:
> "[email protected]<mailto:gpfsug-discuss-bounce
> [email protected]>"
> <[email protected]<mailto:gpfsug-discuss-bounce
> [email protected]>>
> Date: 01/21/2022 09:38 AM
> Subject: [EXTERNAL] Re: [gpfsug-discuss] mmbackup file selections Sent
> by:
> [email protected]<mailto:gpfsug-discuss-bounces
> @spectrumscale.org>
>
> ________________________________
>
>
>
> Thank you
>
> Right in the command line seems to have worked.
> At the end of the script I now copy the contents of the .mmbackupCfg
> folder to a date stamped logging folder
>
> Checking how many entries are in these files compared to the Summary:
> wc -l mmbackup*
> 188 mmbackupChanged.ix.155513.6E9E8BE2.1.nhmfsa
> 47 mmbackupChanged.ix.219901.8E89AB35.1.nhmfsa
> 188 mmbackupChanged.ix.37893.EDFB8FA7.1.nhmfsa
> 40 mmbackupChanged.ix.81032.78717A00.1.nhmfsa
> 2 mmbackupExpired.ix.78683.2DD25239.1.nhmfsa
> 141 mmbackupStatech.ix.219901.8E89AB35.1.nhmfsa
> 148 mmbackupStatech.ix.81032.78717A00.1.nhmfsa
> 754 total
> From Summary
> Total number of objects inspected: 755
> I can live with a discrepancy of 1.
>
> 2 mmbackupExpired.ix.78683.2DD25239.1.nhmfsa
> From Summary
> Total number of objects expired: 2
> That matches
>
> wc -l mmbackupC* mmbackupS*
> 188 mmbackupChanged.ix.155513.6E9E8BE2.1.nhmfsa
> 47 mmbackupChanged.ix.219901.8E89AB35.1.nhmfsa
> 188 mmbackupChanged.ix.37893.EDFB8FA7.1.nhmfsa
> 40 mmbackupChanged.ix.81032.78717A00.1.nhmfsa
> 141 mmbackupStatech.ix.219901.8E89AB35.1.nhmfsa
> 148 mmbackupStatech.ix.81032.78717A00.1.nhmfsa
> 752 total
> Summary:
> Total number of objects backed up: 751
>
> A difference of 1 I can live with.
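>
> To save doing this comparison by hand every run, a small check along these
> lines should flag any mismatch automatically (the log path is just a
> placeholder for wherever the mmbackup output is captured):
>
> LOG=/path/to/mmbackup-run.log
> lists=$(cat mmbackupC* mmbackupS* 2>/dev/null | wc -l)
> summary=$(grep -oP 'Total number of objects backed up:\s*\K[0-9]+' "$LOG")
> echo "list entries: $lists, summary backed up: $summary, diff: $((lists - summary))"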
>
> What does Statech stand for?
>
> Just this to sort out:
> Total number of objects failed: 1
> I will add:
> --tsm-errorlog TSMErrorLogFile
>
>
> Kindest regards,
> Paul
>
> Paul Ward
> TS Infrastructure Architect
> Natural History Museum
> T: 02079426450
> E: [email protected]
>
> From: [email protected] On Behalf Of IBM Spectrum Scale
> Sent: 19 January 2022 15:09
> To: gpfsug main discussion list <[email protected]>
> Cc: [email protected]
> Subject: Re: [gpfsug-discuss] mmbackup file selections
>
>
> This is to set the environment for mmbackup.
> If mmbackup is invoked within a script, you can set "export DEBUGmmbackup=2"
> right above the mmbackup command.
> e.g., in your script:
> ....
> export DEBUGmmbackup=2
> mmbackup ....
>
> Or, you can set it in the same command line like
> DEBUGmmbackup=2 mmbackup ....
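>
> As a fuller sketch, a wrapper script might look like this (the filesystem
> name and TSM server below are only examples):
>
> #!/bin/bash
> export DEBUGmmbackup=2       # keep the .mmbackupCfg working files after the run
> mmbackup /gpfs/somefs -t incremental --tsm-servers TSMSERVER1 -L 3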
>
> Regards, The Spectrum Scale (GPFS) team
>
> ----------------------------------------------------------------------
> If your query concerns a potential software error in Spectrum Scale (GPFS)
> and you have an IBM software maintenance contract please contact
> 1-800-237-5511 in the United States or your local IBM Service Center in other
> countries.
>
>
> From: "Paul Ward" <[email protected]<mailto:[email protected]>>
> To: "gpfsug main discussion list"
> <[email protected]<mailto:gpfsug-discuss@spectrumscale.
> org>>
> Cc:
> "[email protected]<mailto:gpfsug-discuss-bounce
> [email protected]>"
> <[email protected]<mailto:gpfsug-discuss-bounce
> [email protected]>>
> Date: 01/19/2022 06:04 AM
> Subject: [EXTERNAL] Re: [gpfsug-discuss] mmbackup file selections Sent
> by:
> [email protected]<mailto:gpfsug-discuss-bounces
> @spectrumscale.org>
>
> ________________________________
>
>
>
>
> Thank you.
>
> We run a script on all our nodes that checks to see if they are the cluster
> manager.
> If they are, then they take responsibility to start the backup script.
> The script then randomly selects one of the available backup nodes and uses
> dsmsh mmbackup on it.
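>
> Roughly, the selection logic is along these lines (the node names and the
> mmlsmgr parsing below are simplified and illustrative):
>
> #!/bin/bash
> # only continue if this node is currently the cluster manager
> mgr=$(mmlsmgr -c | awk '{print $NF}' | tr -d '()')   # assumes the manager name is the last field
> [ "$mgr" = "$(hostname -s)" ] || exit 0
> # pick one of the available backup nodes at random and run the backup there
> backup_nodes="backup-node-01 backup-node-02 backup-node-03"
> target=$(echo "$backup_nodes" | tr ' ' '\n' | shuf -n 1)
> ssh "$target" "dsmsh mmbackup /gpfs/somefs -t incremental"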
>
> Where does this command belong?
> I have seen it listed as an export command; again, where should that be run?
> On all backup nodes, or all nodes?
>
>
> Kindest regards,
> Paul
>
> Paul Ward
> TS Infrastructure Architect
> Natural History Museum
> T: 02079426450
> E: [email protected]
>
> From: [email protected] On Behalf Of IBM Spectrum Scale
> Sent: 18 January 2022 22:54
> To: gpfsug main discussion list <[email protected]>
> Cc: [email protected]
> Subject: Re: [gpfsug-discuss] mmbackup file selections
>
> Hi Paul,
>
> If you run mmbackup with "DEBUGmmbackup=2", it keeps all working files even
> after a successful backup. They are available at MMBACKUP_RECORD_ROOT (the
> default is the FSroot or FilesetRoot directory).
> In the .mmbackupCfg directory, there are 3 directories:
> updatedFiles : contains a file that lists all candidates for backup
> statechFiles : contains a file that lists all candidates for meta info update
> expiredFiles : contains a file that lists all candidates for expiration
>
>
> Regards, The Spectrum Scale (GPFS) team
>
> ----------------------------------------------------------------------
> If your query concerns a potential software error in Spectrum Scale (GPFS)
> and you have an IBM software maintenance contract please contact
> 1-800-237-5511 in the United States or your local IBM Service Center in other
> countries.
>
>
> From: "Paul Ward" <[email protected]<mailto:[email protected]>>
> To:
> "[email protected]<mailto:gpfsug-discuss@spectrumscale.
> org>"
> <[email protected]<mailto:gpfsug-discuss@spectrumscale.
> org>>
> Date: 01/18/2022 11:56 AM
> Subject: [EXTERNAL] [gpfsug-discuss] mmbackup file selections Sent by:
> [email protected]<mailto:gpfsug-discuss-bounces
> @spectrumscale.org>
>
> ________________________________
>
>
>
>
>
> Hi,
>
> I am trying to work out what files have been sent to backup using mmbackup.
> I have increased the -L value from 3 up to 6 but only seem to see the files
> that are in scope, not the ones that are selected.
>
> I can see the three file lists generated during a backup, but can't seem to
> find a list of what files were backed up.
>
> It should be the diff of the shadow and shadow-old, but the wc -l of the diff
> doesn't match the number of files in the backup summary.
> Wrong assumption?
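>
> For reference, this is roughly how I was counting it (the shadow file names
> below are placeholders for whatever is actually in the fileset root):
>
> old=.mmbackupShadow.1.TSMSERVER.old     # previous shadow DB (placeholder name)
> new=.mmbackupShadow.1.TSMSERVER         # current shadow DB (placeholder name)
> diff "$old" "$new" | grep -c '^>'       # count lines only present in the new shadow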
>
> Where should I be looking? Surely it shouldn't be this hard to see what
> files are selected?
>
>
> Kindest regards,
> Paul
>
> Paul Ward
> TS Infrastructure Architect
> Natural History Museum
> T: 02079426450
> E: [email protected]
> _______________________________________________
> gpfsug-discuss mailing list
> gpfsug-discuss at spectrumscale.org
> http://gpfsug.org/mailman/listinfo/gpfsug-discuss
>
>
> _______________________________________________
> gpfsug-discuss mailing list
> gpfsug-discuss at spectrumscale.org
> http://gpfsug.org/mailman/listinfo/gpfsug-discuss
>
>
> _______________________________________________
> gpfsug-discuss mailing list
> gpfsug-discuss at spectrumscale.org
> http://gpfsug.org/mailman/listinfo/gpfsug-discuss
>
>
>
> _______________________________________________
> gpfsug-discuss mailing list
> gpfsug-discuss at spectrumscale.org
> http://gpfsug.org/mailman/listinfo/gpfsug-discuss
--
-- Skylar Thompson ([email protected])
-- Genome Sciences Department (UW Medicine), System Administrator
-- Foege Building S046, (206)-685-7354
-- Pronouns: He/Him/His
_______________________________________________
gpfsug-discuss mailing list
gpfsug-discuss at spectrumscale.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss