[ https://issues.apache.org/jira/browse/HADOOP-16806?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17584335#comment-17584335 ]
ASF GitHub Bot commented on HADOOP-16806:
-----------------------------------------

jmahonin commented on PR #4753:
URL: https://github.com/apache/hadoop/pull/4753#issuecomment-1225922219

Specifying `us-west-2` for both landsat-pds and usgs-landsat allowed my tests to pass with the various region settings in `auth-keys.xml` set to `us-east-1`. There is an issue with `ITestS3ABucketExistence#testAccessPointProbingV2` and `#testAccessPointRequired` when another region is enabled:

```
[ERROR] testAccessPointProbingV2(org.apache.hadoop.fs.s3a.ITestS3ABucketExistence)  Time elapsed: 10.476 s  <<< ERROR!
java.lang.IllegalArgumentException: The region field of the ARN being passed as a bucket parameter to an S3 operation does not match the region the client was configured with. Provided region: 'eu-west-1'; client region: 'us-east-1'.
	at org.apache.hadoop.fs.s3a.ITestS3ABucketExistence.lambda$testAccessPointProbingV2$12(ITestS3ABucketExistence.java:172)
	at org.apache.hadoop.fs.s3a.ITestS3ABucketExistence.expectUnknownStore(ITestS3ABucketExistence.java:103)
	at org.apache.hadoop.fs.s3a.ITestS3ABucketExistence.testAccessPointProbingV2(ITestS3ABucketExistence.java:171)

[ERROR] testAccessPointRequired(org.apache.hadoop.fs.s3a.ITestS3ABucketExistence)  Time elapsed: 0.889 s  <<< ERROR!
java.lang.IllegalArgumentException: The region field of the ARN being passed as a bucket parameter to an S3 operation does not match the region the client was configured with. Provided region: 'eu-west-1'; client region: 'us-east-1'.
	at org.apache.hadoop.fs.s3a.ITestS3ABucketExistence.lambda$testAccessPointRequired$14(ITestS3ABucketExistence.java:189)
	at org.apache.hadoop.fs.s3a.ITestS3ABucketExistence.expectUnknownStore(ITestS3ABucketExistence.java:103)
	at org.apache.hadoop.fs.s3a.ITestS3ABucketExistence.testAccessPointRequired(ITestS3ABucketExistence.java:188)
```

Commenting out my region settings allows these tests to pass, however.
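For reference, the per-bucket overrides described in the comment could be expressed in `auth-keys.xml` roughly as follows. This is a sketch, not the commenter's actual file: the keys assume S3A's per-bucket `fs.s3a.bucket.NAME.option` convention and the `fs.s3a.endpoint.region` option, applied to the two public test buckets named above.

```xml
<configuration>
  <!-- Pin the two public test buckets to us-west-2, as described above
       (sketch; keys assume the per-bucket fs.s3a.bucket.NAME.* convention). -->
  <property>
    <name>fs.s3a.bucket.landsat-pds.endpoint.region</name>
    <value>us-west-2</value>
  </property>
  <property>
    <name>fs.s3a.bucket.usgs-landsat.endpoint.region</name>
    <value>us-west-2</value>
  </property>
  <!-- Global region setting; per the comment, leaving this unset lets the
       access-point tests pass, while us-east-1 triggers the ARN mismatch. -->
  <property>
    <name>fs.s3a.endpoint.region</name>
    <value>us-east-1</value>
  </property>
</configuration>
```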
> AWS AssumedRoleCredentialProvider needs ExternalId add
> ------------------------------------------------------
>
>                 Key: HADOOP-16806
>                 URL: https://issues.apache.org/jira/browse/HADOOP-16806
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>    Affects Versions: 3.2.1
>            Reporter: Jon Hartlaub
>            Priority: Minor
>              Labels: pull-request-available
>
> AWS has added a security feature to the assume-role function in the form of the "ExternalId" key in the AWS Java SDK {{STSAssumeRoleSessionCredentialsProvider.Builder}} class. To support this security feature, the hadoop-aws {{AssumedRoleCredentialProvider}} needs a patch to include this value from the configuration, as well as an added constant in the {{org.apache.hadoop.fs.s3a.Constants}} file.
>
> The ExternalId is not a required security feature; it is an augmentation of the current assume-role configuration.
>
> Proposed:
> * Get the assume-role ExternalId token from the configuration for the configuration key {{fs.s3a.assumed.role.externalid}}
> * Use the configured ExternalId value in the {{STSAssumeRoleSessionCredentialsProvider.Builder}}, e.g.
> {{if (StringUtils.isNotEmpty(externalId)) {}}
> {{  builder.withExternalId(externalId); // include the token for cross-account assume role}}
> {{}}}
>
> Tests:
> * +Unit test+ which verifies the ExternalId state value of the {{AssumedRoleCredentialProvider}} is consistent with the configured value - either empty or populated
> * Question: not sure about how to write the +integration test+ for this feature. We have an account configured for this use-case that verifies this feature, but I don't have much context on the Hadoop project's AWS S3 integration tests; perhaps a pointer could help.
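The conditional wiring proposed above can be sketched in plain Java as follows. This is a minimal illustration, not the patch itself: `StubBuilder` is a hypothetical stand-in for the AWS SDK's `STSAssumeRoleSessionCredentialsProvider.Builder` (which does expose `withExternalId(String)`), so the sketch runs without the SDK on the classpath.

```java
// Sketch of the proposed "only set ExternalId when configured" logic.
// StubBuilder is a hypothetical stand-in for the AWS SDK's
// STSAssumeRoleSessionCredentialsProvider.Builder.
class StubBuilder {
    String externalId; // recorded so the sketch can be inspected

    StubBuilder withExternalId(String id) {
        this.externalId = id;
        return this;
    }
}

public class Main {
    // Mirrors the proposal: pass the token to the builder only when the
    // fs.s3a.assumed.role.externalid value is non-empty.
    static StubBuilder configure(StubBuilder builder, String externalId) {
        if (externalId != null && !externalId.isEmpty()) {
            builder.withExternalId(externalId); // cross-account assume-role token
        }
        return builder;
    }

    public static void main(String[] args) {
        // Configured value is applied; empty value leaves the builder untouched.
        System.out.println(configure(new StubBuilder(), "token-123").externalId);
        System.out.println(configure(new StubBuilder(), "").externalId);
    }
}
```

In the real provider, `configure` would read the value via `conf.getTrimmed(...)` and the guard could use `StringUtils.isNotEmpty` as the issue suggests; the shape of the conditional is the same.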
--
This message was sent by Atlassian Jira
(v8.20.10#820010)