[jira] [Comment Edited] (HADOOP-17372) S3A AWS Credential provider loading gets confused with isolated classloaders
    [ https://issues.apache.org/jira/browse/HADOOP-17372?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17789547#comment-17789547 ]

Steve Loughran edited comment on HADOOP-17372 at 11/25/23 12:25 PM:
--------------------------------------------------------------------

well, an option to disable the isolated classloader is something which could be added; just think of a test + docs.

was (Author: ste...@apache.org):
well, an option to disable the isolated classloader is something which could be added; just think of a test

> S3A AWS Credential provider loading gets confused with isolated classloaders
> ----------------------------------------------------------------------------
>
>                 Key: HADOOP-17372
>                 URL: https://issues.apache.org/jira/browse/HADOOP-17372
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>    Affects Versions: 3.4.0
>            Reporter: Steve Loughran
>            Assignee: Steve Loughran
>            Priority: Major
>             Fix For: 3.3.1
>
>
> Problem: exception in loading S3A credentials for an FS, "Class class
> com.amazonaws.auth.EnvironmentVariableCredentialsProvider does not implement
> AWSCredentialsProvider"
>
> Location: S3A + Spark dataframes test
>
> Hypothesised cause:
> Configuration.getClasses() uses the context classloader, and with the Spark
> isolated CL that's different from the one the s3a FS uses, so it can't load
> AWS credential providers.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
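The hypothesised cause above can be demonstrated in isolation. The following is a self-contained sketch (not Hadoop or Spark code; `Provider` and `EnvProvider` are hypothetical stand-ins for `AWSCredentialsProvider` and its implementation): a child-first classloader redefines the same classes from their bytecode, so the assignability check fails even though the class names match, which is exactly the shape of the "does not implement AWSCredentialsProvider" failure.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ClassLoaderDemo {

    // Stand-in for AWSCredentialsProvider.
    public interface Provider { }

    // Stand-in for EnvironmentVariableCredentialsProvider.
    public static class EnvProvider implements Provider { }

    /**
     * Child-first loader, mimicking an "isolated" classloader: it redefines
     * the demo classes from their bytecode instead of delegating to its parent.
     */
    public static class IsolatedLoader extends ClassLoader {
        public IsolatedLoader(ClassLoader parent) { super(parent); }

        @Override
        protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
            if (name.startsWith("ClassLoaderDemo$")) {   // isolate only the demo classes
                Class<?> c = findLoadedClass(name);
                if (c == null) {
                    byte[] bytes = readBytes(name.replace('.', '/') + ".class");
                    c = defineClass(name, bytes, 0, bytes.length);
                }
                if (resolve) {
                    resolveClass(c);
                }
                return c;
            }
            return super.loadClass(name, resolve);       // everything else delegates normally
        }

        private byte[] readBytes(String resource) throws ClassNotFoundException {
            try (InputStream in = getParent().getResourceAsStream(resource);
                 ByteArrayOutputStream out = new ByteArrayOutputStream()) {
                int b;
                while ((b = in.read()) != -1) {
                    out.write(b);
                }
                return out.toByteArray();
            } catch (IOException e) {
                throw new ClassNotFoundException(resource, e);
            }
        }
    }

    public static void main(String[] args) throws Exception {
        IsolatedLoader isolated = new IsolatedLoader(ClassLoaderDemo.class.getClassLoader());
        Class<?> impl = isolated.loadClass("ClassLoaderDemo$EnvProvider");

        // Same class name, different defining loader: EnvProvider implements the
        // isolated copy of Provider, not ours, so the check fails.
        System.out.println("implements Provider? " + Provider.class.isAssignableFrom(impl));
        // prints "implements Provider? false"
        System.out.println("same loader? " + (impl.getClassLoader() == Provider.class.getClassLoader()));
        // prints "same loader? false"
    }
}
```

An option to disable the isolation would amount to making the loader delegate to the parent for these prefixes, which is also what sharing the `com.amazonaws` prefix achieves on the Spark side.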
[jira] [Comment Edited] (HADOOP-17372) S3A AWS Credential provider loading gets confused with isolated classloaders
    [ https://issues.apache.org/jira/browse/HADOOP-17372?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17229527#comment-17229527 ]

Steve Loughran edited comment on HADOOP-17372 at 11/10/20, 9:38 PM:
--------------------------------------------------------------------

setting com.amazonaws in "spark.sql.hive.metastore.sharedPrefixes" makes this go away.

At the same time, I think we could move the env variables provider to something under o.a.h so that by default everything works... you'd only need to play with this setting in the specific case where "you are doing custom plugin stuff".

was (Author: ste...@apache.org):
setting com.aws in "spark.sql.hive.metastore.sharedPrefixes" should be enough.

> S3A AWS Credential provider loading gets confused with isolated classloaders
> ----------------------------------------------------------------------------
>
>                 Key: HADOOP-17372
>                 URL: https://issues.apache.org/jira/browse/HADOOP-17372
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>    Affects Versions: 3.4.0
>            Reporter: Steve Loughran
>            Priority: Major
>
>
> Problem: exception in loading S3A credentials for an FS, "Class class
> com.amazonaws.auth.EnvironmentVariableCredentialsProvider does not implement
> AWSCredentialsProvider"
>
> Location: S3A + Spark dataframes test
>
> Hypothesised cause:
> Configuration.getClasses() uses the context classloader, and with the Spark
> isolated CL that's different from the one the s3a FS uses, so it can't load
> AWS credential providers.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
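The workaround described in the comment can be expressed as a single Spark configuration entry. A sketch for spark-defaults.conf, assuming the stock Spark defaults for this property (the JDBC driver prefixes shown are Spark's documented defaults; check your version before copying):

```
# spark-defaults.conf: share AWS SDK classes between Spark's main classloader
# and the isolated Hive-metastore classloader, so the s3a connector and the
# credential providers it loads see the same Class objects.
spark.sql.hive.metastore.sharedPrefixes  com.mysql.jdbc,org.postgresql,com.microsoft.sqlserver,oracle.jdbc,com.amazonaws
```

Note the prefix is com.amazonaws (the AWS SDK's package root), not com.aws; the property is a comma-separated list of class-name prefixes that are always loaded from the shared classloader.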