steveloughran commented on PR #7700:
URL: https://github.com/apache/hadoop/pull/7700#issuecomment-3700154301

   rebased. Tested against rustfs on localhost out of curiosity, so now I know 
what to say if someone asks if it can be used: "not if you need consistent 
directory listings"
   
   1. Eventually consistent path listings on delete.
   
   Looking into the docs, 
https://deepwiki.com/rustfs/rustfs/5.5-concurrency-management-and-caching#cache-coherence-and-invalidation
 says "Invalidation is performed asynchronously to avoid blocking the 
write path."
   
   This is not as bad as AWS S3 used to be, where a 404 was cached after a 
failed HEAD request and listings took time even to find a newly created 
object.
   
   ```
   [ERROR] 
org.apache.hadoop.fs.contract.s3a.ITestS3AContractRootDir.testListEmptyRootDirectory
 -- Time elapsed: 0.059 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: Deleted file: unexpectedly found 
s3a://rustybucket/job-00 as  S3AFileStatus{path=s3a://rustybucket/job-00; 
isDirectory=true; modification_time=0; access_time=0; owner=stevel; 
group=stevel; permission=rwxrwxrwx; isSymlink=false; hasAcl=false; 
isEncrypted=true; isErasureCoded=false} isEmptyDirectory=FALSE eTag=null 
versionId=null
           at 
org.apache.hadoop.fs.contract.ContractTestUtils.assertPathDoesNotExist(ContractTestUtils.java:1088)
           at 
org.apache.hadoop.fs.contract.ContractTestUtils.assertDeleted(ContractTestUtils.java:817)
           at 
org.apache.hadoop.fs.contract.AbstractContractRootDirectoryTest.testListEmptyRootDirectory(AbstractContractRootDirectoryTest.java:195)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.performance.ITestDirectoryMarkerListing.testRenameEmptyDirOverMarker
 -- Time elapsed: 0.349 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: Deleted file: unexpectedly found 
s3a://rustybucket/job-00-fork-0006/test/testRenameEmptyDirOverMarker/base/sourceDir
 as  
S3AFileStatus{path=s3a://rustybucket/job-00-fork-0006/test/testRenameEmptyDirOverMarker/base/sourceDir;
 isDirectory=true; modification_time=0; access_time=0; owner=stevel; 
group=stevel; permission=rwxrwxrwx; isSymlink=false; hasAcl=false; 
isEncrypted=true; isErasureCoded=false} isEmptyDirectory=FALSE eTag=null 
versionId=null
           at org.junit.jupiter.api.AssertionUtils.fail(AssertionUtils.java:38)
           at org.junit.jupiter.api.Assertions.fail(Assertions.java:138)
           at 
org.apache.hadoop.fs.contract.ContractTestUtils.assertPathDoesNotExist(ContractTestUtils.java:1088)
           at 
org.apache.hadoop.fs.contract.ContractTestUtils.assertDeleted(ContractTestUtils.java:817)
           at 
org.apache.hadoop.fs.contract.ContractTestUtils.assertDeleted(ContractTestUtils.java:790)
           at 
org.apache.hadoop.fs.contract.ContractTestUtils.assertDeleted(ContractTestUtils.java:772)
           at 
org.apache.hadoop.fs.contract.AbstractFSContractTestBase.assertDeleted(AbstractFSContractTestBase.java:366)
           at 
org.apache.hadoop.fs.s3a.performance.ITestDirectoryMarkerListing.testRenameEmptyDirOverMarker(ITestDirectoryMarkerListing.java:529)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.ITestS3AEmptyDirectory.testDirectoryBecomesEmpty -- 
Time elapsed: 0.304 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: Deleted file: unexpectedly found 
s3a://rustybucket/job-00-fork-0002/test/testEmptyDir/dir2 as  
S3AFileStatus{path=s3a://rustybucket/job-00-fork-0002/test/testEmptyDir/dir2; 
isDirectory=true; modification_time=0; access_time=0; owner=stevel; 
group=stevel; permission=rwxrwxrwx; isSymlink=false; hasAcl=false; 
isEncrypted=true; isErasureCoded=false} isEmptyDirectory=FALSE eTag=null 
versionId=null
           at org.junit.jupiter.api.AssertionUtils.fail(AssertionUtils.java:38)
           at org.junit.jupiter.api.Assertions.fail(Assertions.java:138)
           at 
org.apache.hadoop.fs.contract.ContractTestUtils.assertPathDoesNotExist(ContractTestUtils.java:1088)
           at 
org.apache.hadoop.fs.contract.ContractTestUtils.assertDeleted(ContractTestUtils.java:817)
           at 
org.apache.hadoop.fs.contract.ContractTestUtils.assertDeleted(ContractTestUtils.java:790)
           at 
org.apache.hadoop.fs.contract.ContractTestUtils.assertDeleted(ContractTestUtils.java:772)
           at 
org.apache.hadoop.fs.contract.AbstractFSContractTestBase.assertDeleted(AbstractFSContractTestBase.java:366)
           at 
org.apache.hadoop.fs.s3a.ITestS3AEmptyDirectory.testDirectoryBecomesEmpty(ITestS3AEmptyDirectory.java:48)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [WARNING] Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 
0.288 s -- in org.apache.hadoop.fs.s3a.ITestS3AEncryptionWithDefaultS3Settings
   [ERROR] Tests run: 4, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 
4.334 s <<< FAILURE! -- in 
org.apache.hadoop.fs.s3a.performance.ITestS3ADeleteCost
   [ERROR] 
org.apache.hadoop.fs.s3a.performance.ITestS3ADeleteCost.testDirMarkersSubdir -- 
Time elapsed: 0.572 s <<< FAILURE!
   java.lang.AssertionError: Expected a java.io.FileNotFoundException to be 
thrown, but got the result: : "[]"
           at 
org.apache.hadoop.test.LambdaTestUtils.intercept(LambdaTestUtils.java:505)
           at 
org.apache.hadoop.test.LambdaTestUtils.intercept(LambdaTestUtils.java:390)
           at 
org.apache.hadoop.fs.s3a.performance.ITestS3ADeleteCost.verifyNoListing(ITestS3ADeleteCost.java:210)
           at 
org.apache.hadoop.fs.s3a.performance.ITestS3ADeleteCost.testDirMarkersSubdir(ITestS3ADeleteCost.java:195)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.ITestS3AFileSystemContract.testRenameDirectoryMoveToExistingDirectory
 -- Time elapsed: 0.389 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: Source exists ==> expected: <false> but 
was: <true>
           at 
org.apache.hadoop.fs.FileSystemContractBaseTest.rename(FileSystemContractBaseTest.java:593)
           at 
org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirectoryMoveToExistingDirectory(FileSystemContractBaseTest.java:516)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.ITestS3AFileSystemContract.testFilesystemIsCaseSensitive
 -- Time elapsed: 0.140 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: File 
existss3a://rustybucket/job-00-fork-0003/test/testfilesystemiscasesensitive ==> 
expected: <false> but was: <true>
           at 
org.apache.hadoop.fs.FileSystemContractBaseTest.testFilesystemIsCaseSensitive(FileSystemContractBaseTest.java:643)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.ITestS3AFileSystemContract.testDeleteRecursively -- 
Time elapsed: 0.200 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: Subdir doesn't exist ==> expected: 
<false> but was: <true>
           at 
org.apache.hadoop.fs.FileSystemContractBaseTest.testDeleteRecursively(FileSystemContractBaseTest.java:430)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [INFO] Running org.apache.hadoop.fs.s3a.performance.ITestCreateSessionTimeout
   [INFO] Running org.apache.hadoop.fs.s3a.performance.ITestS3AMkdirCost
   [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.637 
s -- in org.apache.hadoop.fs.s3a.ITestS3APrefetchingLruEviction
   [INFO] Running org.apache.hadoop.fs.s3a.ITestS3APrefetchingLruEviction
   [WARNING] Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 
0.784 s -- in org.apache.hadoop.fs.s3a.performance.ITestCreateSessionTimeout
   [INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 64.71 
s -- in org.apache.hadoop.fs.s3a.scale.ITestS3ADirectoryPerformance
   [INFO] Running org.apache.hadoop.fs.s3a.performance.ITestS3ARenameCost
   [INFO] Running org.apache.hadoop.fs.s3a.performance.ITestS3AMiscOperationCost
   [INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.535 
s -- in org.apache.hadoop.fs.s3a.performance.ITestS3AOpenCost
   [ERROR] Tests run: 44, Failures: 5, Errors: 0, Skipped: 0, Time elapsed: 
8.030 s <<< FAILURE! -- in org.apache.hadoop.fs.s3a.ITestS3AFSMainOperations
   [ERROR] 
org.apache.hadoop.fs.s3a.ITestS3AFSMainOperations.testDeleteEmptyDirectory -- 
Time elapsed: 0.127 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: Dir doesn't exist ==> expected: <false> 
but was: <true>
           at 
org.apache.hadoop.fs.FSMainOperationsBaseTest.testDeleteEmptyDirectory(FSMainOperationsBaseTest.java:794)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.ITestS3AFSMainOperations.testRenameDirectoryAsEmptyDirectory
 -- Time elapsed: 0.346 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: Source exists ==> expected: <false> but 
was: <true>
           at 
org.apache.hadoop.fs.FSMainOperationsBaseTest.rename(FSMainOperationsBaseTest.java:1162)
           at 
org.apache.hadoop.fs.FSMainOperationsBaseTest.testRenameDirectoryAsEmptyDirectory(FSMainOperationsBaseTest.java:1039)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.ITestS3AFSMainOperations.testRenameDirectoryAsNonExistentDirectory
 -- Time elapsed: 0.195 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: Source exists ==> expected: <false> but 
was: <true>
           at 
org.apache.hadoop.fs.FSMainOperationsBaseTest.rename(FSMainOperationsBaseTest.java:1162)
           at 
org.apache.hadoop.fs.FSMainOperationsBaseTest.doTestRenameDirectoryAsNonExistentDirectory(FSMainOperationsBaseTest.java:1007)
           at 
org.apache.hadoop.fs.FSMainOperationsBaseTest.testRenameDirectoryAsNonExistentDirectory(FSMainOperationsBaseTest.java:990)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.ITestS3AFSMainOperations.testGlobStatusFilterWithMultiplePathMatchesAndNonTrivialFilter
 -- Time elapsed: 0.073 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: expected: <2> but was: <0>
           at 
org.apache.hadoop.fs.FSMainOperationsBaseTest.testGlobStatusFilterWithMultiplePathMatchesAndNonTrivialFilter(FSMainOperationsBaseTest.java:584)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.ITestS3AFSMainOperations.testDeleteRecursively -- Time 
elapsed: 0.230 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: Subdir doesn't exist ==> expected: 
<false> but was: <true>
           at 
org.apache.hadoop.fs.FSMainOperationsBaseTest.testDeleteRecursively(FSMainOperationsBaseTest.java:785)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] Tests run: 10, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 
6.556 s <<< FAILURE! -- in 
org.apache.hadoop.fs.contract.s3a.ITestS3AContractRename
   [ERROR] 
org.apache.hadoop.fs.contract.s3a.ITestS3AContractRename.testRenamePopulatesDirectoryAncestors
 -- Time elapsed: 0.433 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: src path should not exist: unexpectedly 
found 
s3a://rustybucket/job-00-fork-0008/test/testRenamePopulatesDirectoryAncestors/source/dir1/dir2/dir3/dir4
 as  
S3AFileStatus{path=s3a://rustybucket/job-00-fork-0008/test/testRenamePopulatesDirectoryAncestors/source/dir1/dir2/dir3/dir4;
 isDirectory=true; modification_time=0; access_time=0; owner=stevel; 
group=stevel; permission=rwxrwxrwx; isSymlink=false; hasAcl=false; 
isEncrypted=true; isErasureCoded=false} isEmptyDirectory=FALSE eTag=null 
versionId=null
           at 
org.apache.hadoop.fs.contract.ContractTestUtils.assertPathDoesNotExist(ContractTestUtils.java:1088)
           at 
org.apache.hadoop.fs.contract.AbstractFSContractTestBase.assertPathDoesNotExist(AbstractFSContractTestBase.java:322)
           at 
org.apache.hadoop.fs.contract.AbstractContractRenameTest.validateAncestorsMoved(AbstractContractRenameTest.java:290)
           at 
org.apache.hadoop.fs.contract.AbstractContractRenameTest.testRenamePopulatesDirectoryAncestors(AbstractContractRenameTest.java:257)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   ```
   
   Conclusion: listings can see recently deleted files.
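   To illustrate the failure mode: a toy sketch of a store whose delete invalidates its listing cache asynchronously, plus a poll-until-gone probe of the kind a test workaround would need. This is not S3A or rustfs code; the class names, the 200ms lag, and the `awaitGoneFromListing` helper are all invented for illustration.

   ```java
   import java.util.Map;
   import java.util.concurrent.CompletableFuture;
   import java.util.concurrent.ConcurrentHashMap;
   import java.util.concurrent.TimeUnit;
   import java.util.concurrent.TimeoutException;

   public class EventualDeleteProbe {

       /** Toy store: delete returns immediately; the listing cache entry
        *  is removed later, mimicking asynchronous invalidation. */
       static class AsyncInvalidatingStore {
           private final Map<String, String> listingCache = new ConcurrentHashMap<>();
           void put(String key) { listingCache.put(key, "marker"); }
           void delete(String key) {
               // write path returns at once; invalidation happens off-thread
               CompletableFuture.runAsync(() -> {
                   try { TimeUnit.MILLISECONDS.sleep(200); } catch (InterruptedException ignored) { }
                   listingCache.remove(key);
               });
           }
           boolean listingContains(String key) { return listingCache.containsKey(key); }
       }

       /** Poll until the listing no longer shows the deleted key, or time out. */
       static void awaitGoneFromListing(AsyncInvalidatingStore store, String key,
                                        long timeoutMillis, long intervalMillis)
               throws TimeoutException, InterruptedException {
           long deadline = System.currentTimeMillis() + timeoutMillis;
           while (store.listingContains(key)) {
               if (System.currentTimeMillis() > deadline) {
                   throw new TimeoutException("still listed after delete: " + key);
               }
               Thread.sleep(intervalMillis);
           }
       }

       public static void main(String[] args) throws Exception {
           AsyncInvalidatingStore store = new AsyncInvalidatingStore();
           store.put("job-00");
           store.delete("job-00");
           // an immediate assertPathDoesNotExist here would fail intermittently,
           // exactly like the contract tests above; polling rides out the lag
           awaitGoneFromListing(store, "job-00", 5_000, 50);
           System.out.println("gone");
       }
   }
   ```

   The contract tests deliberately do a single probe after delete, so they surface this lag rather than mask it.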
   
   2. The list-multipart-uploads operation may not work *or is inconsistent*.
   
   The ListMultipartUploads operation sometimes returns an empty list even 
when MPUs are in progress, and sometimes returns entries when the list is 
expected to be empty.
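   The same asynchronous-visibility model explains the "Should be one upload ==> expected: <1> but was: <0>" failures: a freshly initiated upload is not yet visible to the listing index. A toy sketch, again with invented names (`LaggyUploadIndex`, `awaitUploadCount`) and an invented 200ms lag, not real SDK calls:

   ```java
   import java.util.Set;
   import java.util.concurrent.CompletableFuture;
   import java.util.concurrent.ConcurrentHashMap;
   import java.util.concurrent.TimeUnit;
   import java.util.concurrent.TimeoutException;

   public class MpuListingProbe {

       /** Toy model: an initiated upload only becomes visible
        *  to list operations after a delay. */
       static class LaggyUploadIndex {
           private final Set<String> visible = ConcurrentHashMap.newKeySet();
           void initiateUpload(String key) {
               CompletableFuture.runAsync(() -> {
                   try { TimeUnit.MILLISECONDS.sleep(200); } catch (InterruptedException ignored) { }
                   visible.add(key);
               });
           }
           long listUploads(String prefix) {
               return visible.stream().filter(k -> k.startsWith(prefix)).count();
           }
       }

       /** Poll until at least `expected` uploads are listed under the prefix. */
       static long awaitUploadCount(LaggyUploadIndex index, String prefix,
                                    long expected, long timeoutMillis)
               throws TimeoutException, InterruptedException {
           long deadline = System.currentTimeMillis() + timeoutMillis;
           long seen;
           while ((seen = index.listUploads(prefix)) < expected) {
               if (System.currentTimeMillis() > deadline) {
                   throw new TimeoutException("only " + seen + " uploads listed");
               }
               Thread.sleep(50);
           }
           return seen;
       }

       public static void main(String[] args) throws Exception {
           LaggyUploadIndex index = new LaggyUploadIndex();
           index.initiateUpload("test/part-0001");
           // a single immediate listUploads("test/") here may report 0,
           // which is exactly what the commit protocol assertions tripped on
           System.out.println(awaitUploadCount(index, "test/", 1, 5_000));
       }
   }
   ```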
   
   
   ```
   
   [ERROR] 
org.apache.hadoop.fs.contract.s3a.ITestS3AContractMultipartUploader.testAbortAllPendingUploads
 -- Time elapsed: 0.249 s <<< FAILURE!
   java.lang.AssertionError: 
   [Number of uploads aborted] 
   Expecting:
    <0>
   to be greater than or equal to:
    <1> 
           at 
org.apache.hadoop.fs.contract.AbstractContractMultipartUploaderTest.testAbortAllPendingUploads(AbstractContractMultipartUploaderTest.java:670)
   
   
   [ERROR] Tests run: 14, Failures: 2, Errors: 0, Skipped: 3, Time elapsed: 
4.687 s <<< FAILURE! -- in org.apache.hadoop.fs.s3a.s3guard.ITestS3GuardTool
   [ERROR] 
org.apache.hadoop.fs.s3a.s3guard.ITestS3GuardTool.testUploadListByAge -- Time 
elapsed: 0.160 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: Should be one upload ==> expected: <1> 
but was: <0>
           at 
org.junit.jupiter.api.AssertionFailureBuilder.build(AssertionFailureBuilder.java:151)
           at 
org.junit.jupiter.api.AssertionFailureBuilder.buildAndThrow(AssertionFailureBuilder.java:132)
           at 
org.junit.jupiter.api.AssertEquals.failNotEqual(AssertEquals.java:197)
           at 
org.junit.jupiter.api.AssertEquals.assertEquals(AssertEquals.java:150)
           at org.junit.jupiter.api.Assertions.assertEquals(Assertions.java:563)
           at 
org.apache.hadoop.fs.s3a.s3guard.ITestS3GuardTool.testUploadListByAge(ITestS3GuardTool.java:201)
   
   
   [ERROR] org.apache.hadoop.fs.s3a.s3guard.ITestS3GuardTool.testUploads -- 
Time elapsed: 0.172 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: Should be one upload ==> expected: <1> 
but was: <0>
           at 
org.junit.jupiter.api.AssertionFailureBuilder.build(AssertionFailureBuilder.java:151)
           at 
org.junit.jupiter.api.AssertionFailureBuilder.buildAndThrow(AssertionFailureBuilder.java:132)
           at 
org.junit.jupiter.api.AssertEquals.failNotEqual(AssertEquals.java:197)
           at 
org.junit.jupiter.api.AssertEquals.assertEquals(AssertEquals.java:150)
           at org.junit.jupiter.api.Assertions.assertEquals(Assertions.java:563)
           at 
org.apache.hadoop.fs.s3a.s3guard.ITestS3GuardTool.testUploads(ITestS3GuardTool.java:155)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.commit.staging.integration.ITestPartitionedCommitProtocol.testCommitLifecycle
 -- Time elapsed: 1.057 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: No multipart uploads in progress under 
s3a://rustybucket/job-00-fork-0002/test/ITestPartitionedCommitProtocol-testCommitLifecycle
 ==> expected: <true> but was: <false>
           at 
org.apache.hadoop.fs.s3a.commit.AbstractCommitITest.assertMultipartUploadsPending(AbstractCommitITest.java:238)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.testCommitLifecycle(AbstractITCommitProtocol.java:849)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.commit.staging.integration.ITestPartitionedCommitProtocol.testParallelJobsToAdjacentPaths
 -- Time elapsed: 1.191 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: No multipart uploads in progress under 
s3a://rustybucket/job-00-fork-0002/test/ITestPartitionedCommitProtocol-testParallelJobsToAdjacentPaths
 ==> expected: <true> but was: <false>
           at 
org.apache.hadoop.fs.s3a.commit.AbstractCommitITest.assertMultipartUploadsPending(AbstractCommitITest.java:238)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.testParallelJobsToAdjacentPaths(AbstractITCommitProtocol.java:1554)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] Tests run: 24, Failures: 2, Errors: 0, Skipped: 1, Time elapsed: 
22.62 s <<< FAILURE! -- in 
org.apache.hadoop.fs.s3a.commit.staging.integration.ITestStagingCommitProtocol
   [ERROR] 
org.apache.hadoop.fs.s3a.commit.staging.integration.ITestStagingCommitProtocol.testCommitLifecycle
 -- Time elapsed: 0.832 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: No multipart uploads in progress under 
s3a://rustybucket/job-00-fork-0003/test/ITestStagingCommitProtocol-testCommitLifecycle
 ==> expected: <true> but was: <false>
           at 
org.apache.hadoop.fs.s3a.commit.AbstractCommitITest.assertMultipartUploadsPending(AbstractCommitITest.java:238)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.testCommitLifecycle(AbstractITCommitProtocol.java:849)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.commit.staging.integration.ITestStagingCommitProtocol.testParallelJobsToAdjacentPaths
 -- Time elapsed: 1.448 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: No multipart uploads in progress under 
s3a://rustybucket/job-00-fork-0003/test/ITestStagingCommitProtocol-testParallelJobsToAdjacentPaths
 ==> expected: <true> but was: <false>
           at 
org.apache.hadoop.fs.s3a.commit.AbstractCommitITest.assertMultipartUploadsPending(AbstractCommitITest.java:238)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.testParallelJobsToAdjacentPaths(AbstractITCommitProtocol.java:1554)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [WARNING] Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 
0.703 s -- in org.apache.hadoop.fs.s3a.commit.ITestUploadRecovery
   [INFO] Running org.apache.hadoop.fs.s3a.commit.ITestUploadRecovery
   [INFO] Running 
org.apache.hadoop.fs.s3a.impl.ITestUploadPurgeOnDirectoryOperations
   [WARNING] Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 
0.723 s -- in org.apache.hadoop.fs.s3a.commit.ITestUploadRecovery
   [INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.923 
s -- in org.apache.hadoop.fs.s3a.commit.ITestUploadRecovery
   [INFO] Running org.apache.hadoop.fs.s3a.impl.ITestS3APutIfMatchAndIfNoneMatch
   [ERROR] Tests run: 25, Failures: 2, Errors: 0, Skipped: 1, Time elapsed: 
23.56 s <<< FAILURE! -- in 
org.apache.hadoop.fs.s3a.commit.staging.integration.ITestDirectoryCommitProtocol
   [ERROR] 
org.apache.hadoop.fs.s3a.commit.staging.integration.ITestDirectoryCommitProtocol.testCommitLifecycle
 -- Time elapsed: 0.796 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: No multipart uploads in progress under 
s3a://rustybucket/job-00-fork-0005/test/ITestDirectoryCommitProtocol-testCommitLifecycle
 ==> expected: <true> but was: <false>
           at 
org.apache.hadoop.fs.s3a.commit.AbstractCommitITest.assertMultipartUploadsPending(AbstractCommitITest.java:238)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.testCommitLifecycle(AbstractITCommitProtocol.java:849)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.commit.staging.integration.ITestDirectoryCommitProtocol.testParallelJobsToAdjacentPaths
 -- Time elapsed: 1.421 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: No multipart uploads in progress under 
s3a://rustybucket/job-00-fork-0005/test/ITestDirectoryCommitProtocol-testParallelJobsToAdjacentPaths
 ==> expected: <true> but was: <false>
           at 
org.apache.hadoop.fs.s3a.commit.AbstractCommitITest.assertMultipartUploadsPending(AbstractCommitITest.java:238)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.testParallelJobsToAdjacentPaths(AbstractITCommitProtocol.java:1554)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [INFO] Running org.apache.hadoop.fs.s3a.impl.ITestTreewalkProblems
   [INFO] Running org.apache.hadoop.fs.s3a.impl.ITestStoreClose
   [WARNING] Tests run: 3, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 
3.562 s -- in org.apache.hadoop.fs.s3a.impl.ITestS3AConditionalCreateBehavior
   [INFO] Running 
org.apache.hadoop.fs.s3a.impl.ITestS3AConditionalCreateBehavior
   [ERROR] Tests run: 2, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 
4.074 s <<< FAILURE! -- in 
org.apache.hadoop.fs.s3a.impl.ITestUploadPurgeOnDirectoryOperations
   [ERROR] 
org.apache.hadoop.fs.s3a.impl.ITestUploadPurgeOnDirectoryOperations.testRenameWithPendingUpload
 -- Time elapsed: 3.464 s <<< FAILURE!
   java.lang.AssertionError: 
   [uploads under 
s3a://rustybucket/job-00-fork-0002/test/testRenameWithPendingUpload/src] 
   Expected size:<1> but was:<0> in:
   <[]>
           at 
org.apache.hadoop.fs.s3a.impl.ITestUploadPurgeOnDirectoryOperations.assertUploadCount(ITestUploadPurgeOnDirectoryOperations.java:133)
           at 
org.apache.hadoop.fs.s3a.impl.ITestUploadPurgeOnDirectoryOperations.testRenameWithPendingUpload(ITestUploadPurgeOnDirectoryOperations.java:111)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.impl.ITestUploadPurgeOnDirectoryOperations.testDeleteWithPendingUpload
 -- Time elapsed: 0.488 s <<< FAILURE!
   java.lang.AssertionError: 
   [uploads under 
s3a://rustybucket/job-00-fork-0002/test/testDeleteWithPendingUpload] 
   Expected size:<1> but was:<0> in:
   <[]>
           at 
org.apache.hadoop.fs.s3a.impl.ITestUploadPurgeOnDirectoryOperations.assertUploadCount(ITestUploadPurgeOnDirectoryOperations.java:133)
           at 
org.apache.hadoop.fs.s3a.impl.ITestUploadPurgeOnDirectoryOperations.testDeleteWithPendingUpload(ITestUploadPurgeOnDirectoryOperations.java:86)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.868 
s -- in org.apache.hadoop.fs.s3a.impl.ITestS3AConditionalCreateBehavior
   [INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.564 
s -- in org.apache.hadoop.fs.s3a.impl.ITestS3AConditionalCreateBehavior
   [INFO] Running org.apache.hadoop.fs.s3a.impl.ITestConnectionTimeouts
   [INFO] Running org.apache.hadoop.fs.s3a.impl.ITestPartialRenamesDeletes
   [INFO] Running org.apache.hadoop.fs.s3a.impl.ITestPartialRenamesDeletes
   [WARNING] Tests run: 15, Failures: 0, Errors: 0, Skipped: 15, Time elapsed: 
6.325 s -- in org.apache.hadoop.fs.s3a.impl.ITestS3APutIfMatchAndIfNoneMatch
   [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.289 
s -- in org.apache.hadoop.fs.s3a.impl.ITestStoreClose
   [ERROR] Tests run: 24, Failures: 6, Errors: 0, Skipped: 1, Time elapsed: 
27.16 s <<< FAILURE! -- in 
org.apache.hadoop.fs.s3a.commit.magic.ITestMagicCommitProtocol
   [ERROR] 
org.apache.hadoop.fs.s3a.commit.magic.ITestMagicCommitProtocol.testCommitLifecycle
 -- Time elapsed: 0.845 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: No multipart uploads in progress under 
s3a://rustybucket/job-00-fork-0008/test/ITestMagicCommitProtocol-testCommitLifecycle
 ==> expected: <true> but was: <false>
           at 
org.apache.hadoop.fs.s3a.commit.AbstractCommitITest.assertMultipartUploadsPending(AbstractCommitITest.java:238)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.testCommitLifecycle(AbstractITCommitProtocol.java:849)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.Optional.ifPresent(Optional.java:178)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992)
           at 
java.base/java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:762)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.commit.magic.ITestMagicCommitProtocol.testAbortTaskThenJob
 -- Time elapsed: 0.830 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: magic dir : unexpectedly found 
s3a://rustybucket/job-00-fork-0008/test/ITestMagicCommitProtocol-testAbortTaskThenJob/__magic_job-job_202512290159_0008
 as  
S3AFileStatus{path=s3a://rustybucket/job-00-fork-0008/test/ITestMagicCommitProtocol-testAbortTaskThenJob/__magic_job-job_202512290159_0008;
 isDirectory=true; modification_time=0; access_time=0; owner=stevel; 
group=stevel; permission=rwxrwxrwx; isSymlink=false; hasAcl=false; 
isEncrypted=true; isErasureCoded=false} isEmptyDirectory=FALSE eTag=null 
versionId=null
           at org.junit.jupiter.api.AssertionUtils.fail(AssertionUtils.java:38)
           at org.junit.jupiter.api.Assertions.fail(Assertions.java:138)
           at 
org.apache.hadoop.fs.contract.ContractTestUtils.assertPathDoesNotExist(ContractTestUtils.java:1088)
           at 
org.apache.hadoop.fs.s3a.commit.magic.ITestMagicCommitProtocol.assertJobAbortCleanedUp(ITestMagicCommitProtocol.java:112)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.testAbortTaskThenJob(AbstractITCommitProtocol.java:1223)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.Optional.ifPresent(Optional.java:178)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992)
           at 
java.base/java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:762)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.commit.magic.ITestMagicCommitProtocol.testParallelJobsToAdjacentPaths
 -- Time elapsed: 1.270 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: No multipart uploads in progress under 
s3a://rustybucket/job-00-fork-0008/test/ITestMagicCommitProtocol-testParallelJobsToAdjacentPaths
 ==> expected: <true> but was: <false>
           at 
org.apache.hadoop.fs.s3a.commit.AbstractCommitITest.assertMultipartUploadsPending(AbstractCommitITest.java:238)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.testParallelJobsToAdjacentPaths(AbstractITCommitProtocol.java:1554)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.Optional.ifPresent(Optional.java:178)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992)
           at 
java.base/java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:762)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.commit.magic.ITestMagicCommitProtocol.testRecoveryAndCleanup
 -- Time elapsed: 0.811 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: task attempt dir: unexpectedly found 
s3a://rustybucket/job-00-fork-0008/test/ITestMagicCommitProtocol-testRecoveryAndCleanup/__magic_job-job_202512290106_0008/job-job_202512290106_0008/01/tasks/attempt_202512290106_0008_m_000000_0/__base
 as  
S3AFileStatus{path=s3a://rustybucket/job-00-fork-0008/test/ITestMagicCommitProtocol-testRecoveryAndCleanup/__magic_job-job_202512290106_0008/job-job_202512290106_0008/01/tasks/attempt_202512290106_0008_m_000000_0/__base;
 isDirectory=true; modification_time=0; access_time=0; owner=stevel; 
group=stevel; permission=rwxrwxrwx; isSymlink=false; hasAcl=false; 
isEncrypted=true; isErasureCoded=false} isEmptyDirectory=FALSE eTag=null 
versionId=null
           at 
org.apache.hadoop.fs.contract.ContractTestUtils.assertPathDoesNotExist(ContractTestUtils.java:1088)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.assertTaskAttemptPathDoesNotExist(AbstractITCommitProtocol.java:695)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.testRecoveryAndCleanup(AbstractITCommitProtocol.java:655)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.Optional.ifPresent(Optional.java:178)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992)
           at 
java.base/java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:762)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.commit.magic.ITestMagicCommitProtocol.testAbortJobNotTask
 -- Time elapsed: 0.612 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: task attempt dir: unexpectedly found 
s3a://rustybucket/job-00-fork-0008/test/ITestMagicCommitProtocol-testAbortJobNotTask/__magic_job-job_202512290879_0008/job-job_202512290879_0008/01/tasks/attempt_202512290879_0008_m_000000_0/__base
 as  
S3AFileStatus{path=s3a://rustybucket/job-00-fork-0008/test/ITestMagicCommitProtocol-testAbortJobNotTask/__magic_job-job_202512290879_0008/job-job_202512290879_0008/01/tasks/attempt_202512290879_0008_m_000000_0/__base;
 isDirectory=true; modification_time=0; access_time=0; owner=stevel; 
group=stevel; permission=rwxrwxrwx; isSymlink=false; hasAcl=false; 
isEncrypted=true; isErasureCoded=false} isEmptyDirectory=FALSE eTag=null 
versionId=null
           at 
org.apache.hadoop.fs.contract.ContractTestUtils.assertPathDoesNotExist(ContractTestUtils.java:1088)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.assertTaskAttemptPathDoesNotExist(AbstractITCommitProtocol.java:695)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.lambda$testAbortJobNotTask$11(AbstractITCommitProtocol.java:1294)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.executeWork(AbstractITCommitProtocol.java:624)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.executeWork(AbstractITCommitProtocol.java:608)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.testAbortJobNotTask(AbstractITCommitProtocol.java:1289)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.Optional.ifPresent(Optional.java:178)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992)
           at 
java.base/java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:762)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   [ERROR] 
org.apache.hadoop.fs.s3a.commit.magic.ITestMagicCommitProtocol.testCommitterWithNoOutputs
 -- Time elapsed: 0.752 s <<< FAILURE!
   org.opentest4j.AssertionFailedError: task attempt dir: unexpectedly found 
s3a://rustybucket/job-00-fork-0008/test/ITestMagicCommitProtocol-testCommitterWithNoOutputs/__magic_job-job_202512290330_0008/job-job_202512290330_0008/01/tasks/attempt_202512290330_0008_m_000000_0/__base
 as  
S3AFileStatus{path=s3a://rustybucket/job-00-fork-0008/test/ITestMagicCommitProtocol-testCommitterWithNoOutputs/__magic_job-job_202512290330_0008/job-job_202512290330_0008/01/tasks/attempt_202512290330_0008_m_000000_0/__base;
 isDirectory=true; modification_time=0; access_time=0; owner=stevel; 
group=stevel; permission=rwxrwxrwx; isSymlink=false; hasAcl=false; 
isEncrypted=true; isErasureCoded=false} isEmptyDirectory=FALSE eTag=null 
versionId=null
           at 
org.apache.hadoop.fs.contract.ContractTestUtils.assertPathDoesNotExist(ContractTestUtils.java:1088)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.assertTaskAttemptPathDoesNotExist(AbstractITCommitProtocol.java:695)
           at 
org.apache.hadoop.fs.s3a.commit.AbstractITCommitProtocol.testCommitterWithNoOutputs(AbstractITCommitProtocol.java:1080)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.Optional.ifPresent(Optional.java:178)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992)
           at 
java.base/java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:762)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:992)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at 
java.base/java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:276)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
           at 
java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625)
           at 
java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
           at 
java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
           at 
java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
           at 
java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
           at 
java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   ```
   
   2. Multipart upload listings are inconsistent: all of the committer 
   failures above come down to the same underlying problem, seen directly here:
   ```
   
   [ERROR] 
org.apache.hadoop.fs.s3a.impl.ITestS3AStoreWriterService.testListMultipartUploads
 -- Time elapsed: 4.347 s <<< FAILURE!
   java.lang.AssertionError: 
   [Uploads which we expected to be listed.] 
   Expecting empty but was:<[IdKey{key='job-00-fork-0006/test/pending-part-0', 
uploadId='Mzk1NmVjMDYtYzAxOS00N2NjLWI0OGQtM2ExNTcxN2ExN2QyLjNlYmI5ZWJhLWRlZWEtNDVhYi05MDJlLTA1NjU0MWVjMmUzMXgxNzY3MDMyOTA4MjMyOTEzMDAw'},
       IdKey{key='job-00-fork-0006/test/pending-part-1', 
uploadId='Mzk1NmVjMDYtYzAxOS00N2NjLWI0OGQtM2ExNTcxN2ExN2QyLjk4ODA3Nzk2LWY2NGMtNDQ0NC04OWEzLTNjNWQ1YTEwZmI3NHgxNzY3MDMyOTA4Mjg2NDMxMDAw'},
       IdKey{key='job-00-fork-0006/test/pending-part-2', 
uploadId='Mzk1NmVjMDYtYzAxOS00N2NjLWI0OGQtM2ExNTcxN2ExN2QyLjI3ZGYxODMwLWM3YjEtNGYzZC05MjAzLTdiNzI5MjdmZGNmMXgxNzY3MDMyOTA4MzA2OTQwMDAw'},
       IdKey{key='job-00-fork-0006/test/pending-part-4', 
uploadId='Mzk1NmVjMDYtYzAxOS00N2NjLWI0OGQtM2ExNTcxN2ExN2QyLjQ5MDQ3Njg0LWU5NTAtNGY0NC1hZDM1LTU4ZDk2ZjUwOTZjM3gxNzY3MDMyOTA4NTE5MjE5MDAw'},
       IdKey{key='job-00-fork-0006/test/pending-part-3', 
uploadId='Mzk1NmVjMDYtYzAxOS00N2NjLWI0OGQtM2ExNTcxN2ExN2QyLmJkZGU5Mjg1LTQ1MDUtNGM4NC04OTVkLWU1YWM3MTUxMGVjNngxNzY3MDMyOTA4NDQ4ODAyMDAw'}]>
           at 
org.apache.hadoop.fs.s3a.impl.ITestS3AStoreWriterService.assertUploadsPresent(ITestS3AStoreWriterService.java:131)
           at 
org.apache.hadoop.fs.s3a.impl.ITestS3AStoreWriterService.testListMultipartUploads(ITestS3AStoreWriterService.java:95)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   
   
   ```
   
   
   
   3. The filesystem is case insensitive, at least on macOS.
   
   ```
   [ERROR]   
ITestS3AFileSystemContract>FileSystemContractBaseTest.testFilesystemIsCaseSensitive:643
 File exists  
s3a://rustybucket/job-00-fork-0003/test/testfilesystemiscasesensitive ==> 
expected: <false> but was: <true> 
   ```
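   For context, the contract test creates a lower-case path and then asserts that an 
   upper-cased variant of it does not exist. A toy Python sketch of the behaviour a 
   case-insensitive backing store produces (`CaseFoldingStore` is hypothetical, purely 
   for illustration; it is not how rustfs is implemented):

   ```python
   # Hypothetical model: a store that folds key case on lookup, which is
   # what FileSystemContractBaseTest.testFilesystemIsCaseSensitive trips over.
   class CaseFoldingStore:
       def __init__(self):
           self._objects = {}  # folded key -> original key

       def put(self, key):
           self._objects[key.casefold()] = key

       def exists(self, key):
           return key.casefold() in self._objects


   store = CaseFoldingStore()
   store.put("test/testfilesystemiscasesensitive")
   # A case-sensitive store would answer False here; answering True is
   # exactly the contract-test failure shown above.
   print(store.exists("test/TESTFILESYSTEMISCASESENSITIVE"))
   ```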
   4. The getBucketMetadata() operation is unsupported; the store returns HTTP 501 
   Not Implemented.
   
   ```
   [ERROR] 
org.apache.hadoop.fs.s3a.ITestS3AEndpointRegion.testWithoutRegionConfig -- Time 
elapsed: 0.091 s <<< ERROR!
   org.apache.hadoop.fs.s3a.AWSUnsupportedFeatureException: getBucketMetadata() 
on rustybucket: software.amazon.awssdk.services.s3.model.S3Exception: Not 
Implemented (Service: S3, Status Code: 501, Request ID: null):null: Not 
Implemented (Service: S3, Status Code: 501, Request ID: null)
           at 
org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:311)
           at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:124)
           at org.apache.hadoop.fs.s3a.Invoker.lambda$retry$4(Invoker.java:376)
           at 
org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:468)
           at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:372)
           at org.apache.hadoop.fs.s3a.Invoker.retry(Invoker.java:347)
           at 
org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$getBucketMetadata$10(S3AFileSystem.java:2983)
           at 
org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.invokeTrackingDuration(IOStatisticsBinding.java:546)
           at 
org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.lambda$trackDurationOfOperation$5(IOStatisticsBinding.java:527)
           at 
org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.trackDuration(IOStatisticsBinding.java:448)
           at 
org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2806)
           at 
org.apache.hadoop.fs.s3a.S3AFileSystem.getBucketMetadata(S3AFileSystem.java:2982)
           at 
org.apache.hadoop.fs.s3a.S3AFileSystem$S3AInternalsImpl.getBucketMetadata(S3AFileSystem.java:1603)
           at 
org.apache.hadoop.fs.s3a.ITestS3AEndpointRegion.testWithoutRegionConfig(ITestS3AEndpointRegion.java:140)
           at java.base/java.lang.reflect.Method.invoke(Method.java:568)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
           at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
   Caused by: software.amazon.awssdk.services.s3.model.S3Exception: Not 
Implemented (Service: S3, Status Code: 501, Request ID: null)
           at 
software.amazon.awssdk.services.s3.model.S3Exception$BuilderImpl.build(S3Exception.java:113)
           at 
software.amazon.awssdk.services.s3.model.S3Exception$BuilderImpl.build(S3Exception.java:61)
           at 
software.amazon.awssdk.services.s3.internal.handlers.ExceptionTranslationInterceptor.modifyException(ExceptionTranslationInterceptor.java:88)
           at 
software.amazon.awssdk.core.interceptor.ExecutionInterceptorChain.modifyException(ExecutionInterceptorChain.java:181)
           at 
software.amazon.awssdk.core.internal.http.pipeline.stages.utils.ExceptionReportingUtils.runModifyException(ExceptionReportingUtils.java:54)
           at 
software.amazon.awssdk.core.internal.http.pipeline.stages.utils.ExceptionReportingUtils.reportFailureToInterceptors(ExceptionReportingUtils.java:38)
           at 
software.amazon.awssdk.core.internal.http.pipeline.stages.ExecutionFailureExceptionReportingStage.execute(ExecutionFailureExceptionReportingStage.java:39)
           at 
software.amazon.awssdk.core.internal.http.pipeline.stages.ExecutionFailureExceptionReportingStage.execute(ExecutionFailureExceptionReportingStage.java:26)
           at 
software.amazon.awssdk.core.internal.http.AmazonSyncHttpClient$RequestExecutionBuilderImpl.execute(AmazonSyncHttpClient.java:210)
           at 
software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.invoke(BaseSyncClientHandler.java:103)
           at 
software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.doExecute(BaseSyncClientHandler.java:173)
           at 
software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.lambda$execute$1(BaseSyncClientHandler.java:80)
           at 
software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.measureApiCallSuccess(BaseSyncClientHandler.java:182)
           at 
software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.execute(BaseSyncClientHandler.java:74)
           at 
software.amazon.awssdk.core.client.handler.SdkSyncClientHandler.execute(SdkSyncClientHandler.java:45)
           at 
software.amazon.awssdk.awscore.client.handler.AwsSyncClientHandler.execute(AwsSyncClientHandler.java:53)
           at 
software.amazon.awssdk.services.s3.DefaultS3Client.headBucket(DefaultS3Client.java:7393)
           at 
software.amazon.awssdk.services.s3.DelegatingS3Client.lambda$headBucket$58(DelegatingS3Client.java:5969)
           at 
software.amazon.awssdk.services.s3.internal.crossregion.S3CrossRegionSyncClient.invokeOperation(S3CrossRegionSyncClient.java:67)
           at 
software.amazon.awssdk.services.s3.DelegatingS3Client.headBucket(DelegatingS3Client.java:5969)
           at 
org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$getBucketMetadata$9(S3AFileSystem.java:2985)
           at org.apache.hadoop.fs.s3a.Invoker.once(Invoker.java:122)
           ... 15 more
   ```
   
   This one is trivial, and not unusual with third-party stores.
   
   Overall, it's a bit of a mess, at least as far as the testing goes.
   
   I would avoid using it anywhere you require case sensitivity or listing 
consistency, and would want to spend more time exploring the multipart test 
failures before being confident that you can upload objects many GB in size. That 
listing inconsistency brings back bad memories of the old S3, when Hive and 
Spark queries could silently miss data.
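   The failure mode is that a single list-after-delete assertion races the store's 
   asynchronous cache invalidation; a harness targeting such a store would need a 
   polling probe with a deadline instead. A minimal Python sketch of that idea 
   (`LaggyStore` is a made-up model for illustration, not rustfs's implementation):

   ```python
   import time

   class LaggyStore:
       """Toy model of a store whose listings lag behind deletes,
       i.e. asynchronous cache invalidation."""

       def __init__(self, lag=0.2):
           self._objects = set()
           self._stale_until = {}  # deleted key -> time it leaves listings
           self._lag = lag

       def put(self, key):
           self._objects.add(key)
           self._stale_until.pop(key, None)

       def delete(self, key):
           self._objects.discard(key)
           self._stale_until[key] = time.monotonic() + self._lag

       def list_keys(self):
           now = time.monotonic()
           stale = {k for k, t in self._stale_until.items() if t > now}
           return sorted(self._objects | stale)


   def await_deleted(store, key, timeout=2.0, interval=0.05):
       """Poll listings until a deleted key disappears; False on timeout."""
       deadline = time.monotonic() + timeout
       while time.monotonic() < deadline:
           if key not in store.list_keys():
               return True
           time.sleep(interval)
       return False


   store = LaggyStore()
   store.put("job-00/part-0000")
   store.delete("job-00/part-0000")
   print("job-00/part-0000" in store.list_keys())   # stale entry still listed
   print(await_deleted(store, "job-00/part-0000"))  # eventually disappears
   ```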
   
   I would absolutely avoid using any of the s3a committers against it without 
first understanding what is up with the MPU listings.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
