[ https://issues.apache.org/jira/browse/HADOOP-14821?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16152716#comment-16152716 ]

Steve Loughran commented on HADOOP-14821:
-----------------------------------------

Aaron, that looks like an HDP version; I've fixed the numbers for you.

This is an interesting policy to think about.

# It's clearly an error if you specify a credential provider and your 
credentials can't be loaded from it.
# Is it an error if one of the providers fails but the credentials are found 
elsewhere? As this JIRA says, "postpone the failure until the end"; there's a 
sketch of that policy below.
# But what about the case where the provider load fails and there is some 
password in the Hadoop config file? Should that fall back too?

+ [~lmc...@apache.org] for his opinion

> Executing the command 'hdfs 
> -Dhadoop.security.credential.provider.path=file1.jceks,file2.jceks' fails if 
> permission is denied to some files
> -------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-14821
>                 URL: https://issues.apache.org/jira/browse/HADOOP-14821
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: fs/s3, hdfs-client, security
>    Affects Versions: 2.8.0
>            Reporter: Ernani Pereira de Mattos Junior
>            Priority: Critical
>              Labels: features
>
> ======= 
> Request Use Case: 
> UC1: 
> The customer has the path to a directory and subdirectories full of keys. The 
> customer knows that he does not have access to all the keys but, ignoring 
> this problem, makes a list of the keys. 
> UC1.2: 
> The customer, in FIFO order, tries his access to each key on the list. If 
> access is granted locally, then he can try the login on s3a. 
> UC1.3: 
> The customer, in FIFO order, tries his access to each key on the list. If 
> access is not granted locally, then he skips the login on s3a and tries the 
> next key on the list. 
> ===========
> For now, UC1.3 fails with the exception below and does not try the next key:
> {code}
> $ hdfs  --loglevel DEBUG dfs 
> -Dhadoop.security.credential.provider.path=jceks://hdfs/tmp/aws.jceks,jceks://hdfs/tmp/awst.jceks
>  -ls s3a://av-dl-hwx-nprod-anhffpoc-enriched/hive/e_ceod/
> Not retrying because try once and fail.
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException):
>  Permission denied: user=502549376, access=READ, 
> inode="/tmp/aws.jceks":admin:hdfs:-rwx------
> {code}
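
For anyone wanting to reproduce this programmatically: the same failure comes straight out of {{Configuration.getPassword()}}, which is what s3a uses to resolve its secrets. A minimal sketch; the alias {{fs.s3a.secret.key}} is my assumption, since the aliases actually stored in those keystores aren't shown above.

{code:java}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.alias.CredentialProviderFactory;

public class ReproHadoop14821 {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    conf.set(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH,
        "jceks://hdfs/tmp/aws.jceks,jceks://hdfs/tmp/awst.jceks");
    // With /tmp/aws.jceks unreadable, this throws the AccessControlException
    // shown above instead of moving on to awst.jceks -- the behaviour the
    // reporter wants relaxed. The alias fs.s3a.secret.key is an assumption.
    char[] secret = conf.getPassword("fs.s3a.secret.key");
    System.out.println(secret == null ? "no credential" : "credential found");
  }
}
{code}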


