[ https://issues.apache.org/jira/browse/HADOOP-17372?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17545824#comment-17545824 ]

Steve Loughran commented on HADOOP-17372:
-----------------------------------------

hmm. i don't want to get into second-guessing spark classloader games. too much 
risk of weirdness happening.

we get a lot of escalations of basic s3a functionality not working when a 
HiveConfiguration is passed in, an aws sdk provider is referenced, etc etc. You 
are into advanced use and it's out of scope. sorry.

do try packaging the provider into an org.apache.hadoop or org.apache.spark 
sub-package and see if that makes the problem go away. 
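For reference, the classloader mismatch hypothesised in the issue below can be reproduced in plain Java, independent of Hadoop or Spark. This is a hypothetical sketch: `Provider` and `MyProvider` are stand-ins for `AWSCredentialsProvider` and a credential provider class, and `IsolatingLoader` mimics an isolated classloader that re-defines classes rather than delegating to its parent. The same bytecode loaded by two loaders yields two distinct `Class` objects, so the interface check fails with exactly the "does not implement" shape seen in the report.

```java
import java.io.InputStream;

public class IsolationDemo {

    // Stand-in for AWSCredentialsProvider, loaded by the app classloader.
    public interface Provider {}

    // Stand-in for a credential provider implementation.
    public static class MyProvider implements Provider {}

    // Mimics an isolated classloader (e.g. Spark's): re-defines our classes
    // from their bytecode instead of delegating to the parent loader.
    public static class IsolatingLoader extends ClassLoader {
        @Override
        protected Class<?> loadClass(String name, boolean resolve)
                throws ClassNotFoundException {
            if (name.startsWith("IsolationDemo")) {
                Class<?> already = findLoadedClass(name);
                if (already != null) {
                    return already;
                }
                String path = "/" + name.replace('.', '/') + ".class";
                try (InputStream in = IsolationDemo.class.getResourceAsStream(path)) {
                    byte[] bytes = in.readAllBytes();
                    return defineClass(name, bytes, 0, bytes.length);
                } catch (Exception e) {
                    throw new ClassNotFoundException(name, e);
                }
            }
            return super.loadClass(name, resolve);
        }
    }

    public static void main(String[] args) throws Exception {
        Class<?> isolated =
            new IsolatingLoader().loadClass("IsolationDemo$MyProvider");
        // Same fully-qualified name as the app-loaded class...
        System.out.println(isolated.getName().equals(MyProvider.class.getName()));
        // ...but a different Class object, so the interface check fails,
        // mirroring "Class ... does not implement AWSCredentialsProvider".
        System.out.println(Provider.class.isAssignableFrom(isolated));
    }
}
```

Repackaging the provider under org.apache.hadoop (as suggested above) sidesteps this because Spark's isolation rules delegate org.apache.hadoop classes to the shared loader instead of re-defining them.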

> S3A AWS Credential provider loading gets confused with isolated classloaders
> ----------------------------------------------------------------------------
>
>                 Key: HADOOP-17372
>                 URL: https://issues.apache.org/jira/browse/HADOOP-17372
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>    Affects Versions: 3.4.0
>            Reporter: Steve Loughran
>            Assignee: Steve Loughran
>            Priority: Major
>             Fix For: 3.3.1
>
>
> Problem: exception in loading S3A credentials for an FS, "Class class 
> com.amazonaws.auth.EnvironmentVariableCredentialsProvider does not implement 
> AWSCredentialsProvider"
> Location: S3A + Spark dataframes test
> Hypothesised cause:
> Configuration.getClasses() uses the context classloader, and with the spark 
> isolated CL that's different from the one the s3a FS uses, so it can't load 
> AWS credential providers.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
