On Jun 16, 2022, at 11:54 AM, Eric Lease Morgan <[email protected]> wrote:
> I need help uncompressing a tar file...
I believe I have resolved this problem.
Indeed, I believe AFS was getting confused by both the number of files and the
lengths of the file names. To overcome this problem, I:
1. copied the tar file to my (Macintosh) desktop
2. uncompressed it
3. used the attached Perl script to subdivide the desired
files into subdirectories (because the number of files
is too big and the file names are too long for single
AFS directory entries)
4. zipped up the subdirectories
5. copied the zip file back to the AFS space
6. uncompressed the whole
All of this is good enough for me; fun with high performance computing.
By the way, my Macintosh successfully uncompressed the archive, but the
resulting file system was seemingly created as read-only. Weird, but
functional. Maybe it was doing me a favor?
--
Eric Morgan
University of Notre Dame
#!/usr/bin/env perl

# subdivide.pl - given a few configurations, divide a directory into many smaller ones
# see: https://stackoverflow.com/questions/36511098/split-large-directory-into-subdirectories

# require
use strict;
use warnings;
use File::Copy;

# configure
my $BASE  = 'json';
my $ROOT  = './jsons';
my $LIMIT = 10000;

# initialize
my $i         = 0;
my $directory = '';

# make sure the destination root exists
mkdir( $ROOT ) or die( "Can't make directory $ROOT: $!\n" ) unless -d $ROOT;

# open the source directory and process each file
opendir( my $handle, $BASE ) or die( "Can't open $BASE: $!\n" );
while ( my $file = readdir( $handle ) ) {

	# sanity check; only process JSON files
	next if $file !~ /json$/;
	warn( "      file: $file\n" );

	# check for new directory needed
	if ( $i % $LIMIT == 0 ) {

		# create a new directory
		$directory = sprintf( "%s/json_%03d", $ROOT, ( int( $i / $LIMIT ) + 1 ) );
		warn( " directory: $directory\n" );
		mkdir( $directory ) or warn( "Can't make directory $directory: $!\n" );

	}

	# copy the given file to the new directory
	copy( "$BASE/$file", "$directory/$file" ) or do {
		warn( "Can't copy $file into $directory: $!\n" );
		next;
	};

	# increment
	$i++;

}
closedir( $handle );

# done
exit;