[ https://issues.apache.org/jira/browse/SPARK-11695?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15321539#comment-15321539 ]

Steve Loughran commented on SPARK-11695:
----------------------------------------

There are some interesting ramifications to this code: if the env vars are 
set, they override any value in core-default.xml. It is also going to 
slightly complicate the workings of HADOOP-12807; now that the AWS env vars 
are being picked up, there's a whole set of config options which ought to be 
handled together. The session token is the big one: if that variable is set, 
then copying only the key pair into the fs.s3a properties will stop 
operations from working.

http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html#cli-environment
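
To make the coupling concrete, here is a minimal sketch of picking up the 
full env-var set as a unit, session token included. The helper name is 
hypothetical; the fs.s3a.* keys are Hadoop's documented s3a properties, and 
fs.s3a.session.token assumes a Hadoop version with session-credential 
support.

{code}
import org.apache.hadoop.conf.Configuration

object AwsEnvSketch {
  // Hypothetical helper: copies the AWS env vars into a Hadoop
  // Configuration together, rather than the key pair alone.
  def propagateAwsEnv(conf: Configuration): Unit = {
    val keyId = System.getenv("AWS_ACCESS_KEY_ID")
    val secret = System.getenv("AWS_SECRET_ACCESS_KEY")
    val token = System.getenv("AWS_SESSION_TOKEN")
    if (keyId != null && secret != null) {
      conf.set("fs.s3a.access.key", keyId)
      conf.set("fs.s3a.secret.key", secret)
      // Session credentials are only valid as a triple: key id, secret
      // and token. Copying the key pair without the token produces
      // credentials that S3 will reject.
      if (token != null) {
        conf.set("fs.s3a.session.token", token)
      }
    }
  }
}
{code}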

> Set s3a credentials by default similarly to s3 and s3n
> ------------------------------------------------------
>
>                 Key: SPARK-11695
>                 URL: https://issues.apache.org/jira/browse/SPARK-11695
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Chris Bannister
>            Assignee: Chris Bannister
>            Priority: Trivial
>             Fix For: 1.6.0
>
>
> When creating a new Hadoop configuration, Spark sets the s3 and s3n 
> credentials if the corresponding environment variables are set; it should 
> also set the s3a credentials.
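
A sketch of the kind of change the description asks for, assuming the 
env-var pickup happens while building the Hadoop Configuration (the 
surrounding method is a simplification, not Spark's actual code; the 
property names are the Hadoop-documented s3, s3n and s3a keys):

{code}
import org.apache.hadoop.conf.Configuration

object S3aCredentialsSketch {
  // Sketch only: mirrors the existing s3/s3n handling and extends it to s3a.
  def newConfiguration(): Configuration = {
    val hadoopConf = new Configuration()
    val keyId = System.getenv("AWS_ACCESS_KEY_ID")
    val accessKey = System.getenv("AWS_SECRET_ACCESS_KEY")
    if (keyId != null && accessKey != null) {
      hadoopConf.set("fs.s3.awsAccessKeyId", keyId)
      hadoopConf.set("fs.s3n.awsAccessKeyId", keyId)
      hadoopConf.set("fs.s3a.access.key", keyId)         // new: s3a
      hadoopConf.set("fs.s3.awsSecretAccessKey", accessKey)
      hadoopConf.set("fs.s3n.awsSecretAccessKey", accessKey)
      hadoopConf.set("fs.s3a.secret.key", accessKey)     // new: s3a
    }
    hadoopConf
  }
}
{code}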


