[ https://issues.apache.org/jira/browse/SPARK-20153?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15954233#comment-15954233 ]

Steve Loughran commented on SPARK-20153:
----------------------------------------

This is fixed in Hadoop 2.8 with [per-bucket configuration|http://hadoop.apache.org/docs/r2.8.0/hadoop-aws/tools/hadoop-aws/index.html#Configurations_different_S3_buckets]; see HADOOP-13336.
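
For illustration, here is a minimal sketch of what per-bucket credentials look like from the Spark side, assuming the Hadoop 2.8+ s3a jars are on the classpath; the bucket names and key values below are placeholders, not anything from this issue:

{code}
// Hypothetical example: per-bucket S3A credentials via spark.hadoop.* properties.
// The fs.s3a.bucket.<bucket>.* options are the Hadoop 2.8 per-bucket mechanism
// from HADOOP-13336; "bucket-a"/"bucket-b" and the key values are placeholders.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("per-bucket-s3a")
  // credentials used only for s3a://bucket-a/...
  .config("spark.hadoop.fs.s3a.bucket.bucket-a.access.key", "ACCESS_KEY_A")
  .config("spark.hadoop.fs.s3a.bucket.bucket-a.secret.key", "SECRET_KEY_A")
  // credentials used only for s3a://bucket-b/...
  .config("spark.hadoop.fs.s3a.bucket.bucket-b.access.key", "ACCESS_KEY_B")
  .config("spark.hadoop.fs.s3a.bucket.bucket-b.secret.key", "SECRET_KEY_B")
  .getOrCreate()

// Each Hive-on-S3 table then resolves against its own bucket's credentials.
{code}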

I would *really* advise against trying to re-implement this in Spark: having 
one consistent model for configuring s3a bindings everywhere will be the only 
way to debug what's going on, especially given that, for security reasons, you 
can't log what's going on.

As a temporary workaround, one which will leak your secrets to logs, you can 
embed the credentials in the URI as s3a://key:secret@bucket, URL-encoding the 
secret.
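
A sketch of that workaround, with placeholder credentials and a hypothetical bucket/path; again, the secret ends up in the URI and therefore in logs and the UI:

{code}
// Temporary workaround only: inline credentials in the s3a URI.
// The secret must be URL-encoded, since secrets often contain '/' or '+'.
// Assumes the SparkSession `spark` from the sketch above; all values are placeholders.
import java.net.URLEncoder

val access = "ACCESS_KEY"                                    // placeholder
val secret = URLEncoder.encode("SECRET/KEY+VALUE", "UTF-8")  // placeholder, encoded
val df = spark.read.parquet(s"s3a://$access:$secret@some-bucket/path/to/table")
{code}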

> Support multiple AWS credentials in order to access multiple Hive-on-S3 tables 
> in a Spark application
> ---------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-20153
>                 URL: https://issues.apache.org/jira/browse/SPARK-20153
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.0.1, 2.1.0
>            Reporter: Franck Tago
>            Priority: Minor
>
> I need to access multiple Hive tables in my Spark application, where each Hive 
> table is
> 1- an external table with data sitting on S3, and
> 2- owned by a different AWS user, so I need to provide different AWS 
> credentials. 
> I am familiar with setting the AWS credentials in the Hadoop configuration 
> object, but that does not really help me because I can only set one pair of 
> (fs.s3a.awsAccessKeyId, fs.s3a.awsSecretAccessKey).
> From my research, there is no easy or elegant way to do this in Spark.
> Why is that?
> How do I address this use case?


