[ https://issues.apache.org/jira/browse/HADOOP-11478?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14293894#comment-14293894 ]

Charles Lamb commented on HADOOP-11478:
---------------------------------------

[~ranadip],

I assume that once you configured your KMS ACLs per the instructions in 
HADOOP-11479, this problem went away. Feel free to reopen if that's not the 
case.

Charles


> HttpFSServer does not properly impersonate a real user when executing "open" 
> operation in a kerberised environment
> ------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-11478
>                 URL: https://issues.apache.org/jira/browse/HADOOP-11478
>             Project: Hadoop Common
>          Issue Type: Bug
>    Affects Versions: 2.6.0
>         Environment: CentOS
>            Reporter: Ranadip
>            Priority: Blocker
>
> Setup:
> - Kerberos enabled in the cluster, including Hue SSO
> - Encryption enabled using KMS. Encryption key and encryption zone created. 
> A KMS key-level ACL created so that only the real user has full access to 
> the key, and no one else.
> Manifestation:
> Using Hue, the real user logs in with Kerberos credentials. For direct 
> access, the user runs kinit and then issues curl calls.
> Creating a new file inside the encryption zone succeeds as expected, but 
> attempting to view the contents of the file fails with the exception:
> "User [httpfs] is not authorized to perform [DECRYPT_EEK] on key with ACL 
> name [mykeyname]!!"
> This may be linked to HDFS-6849. In HttpFSServer.java, the OPEN handler 
> calls command.execute(fs) directly (and this fails), whereas in CREATE the 
> call is wrapped in fsExecute(user, command). This difference appears to be 
> the cause of the problem.
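To illustrate the diagnosis above: when the OPEN handler executes the command directly, the KMS sees the HttpFS service principal ("httpfs") rather than the impersonated end user, so the key ACL rejects DECRYPT_EEK. The sketch below is a minimal stand-in for this pattern, not Hadoop's actual implementation — the doAs and decryptEEK methods here are hypothetical analogues of Hadoop's UserGroupInformation.doAs / fsExecute(user, command) and the KMS ACL check.

```java
import java.util.concurrent.Callable;

public class ImpersonationSketch {
    // The identity the KMS sees; HttpFS runs as the "httpfs" service user.
    static String effectiveUser = "httpfs";

    // Hypothetical analogue of fsExecute(user, command): run the command
    // with the real user's identity, then restore the previous identity.
    static <T> T doAs(String user, Callable<T> command) throws Exception {
        String previous = effectiveUser;
        effectiveUser = user;
        try {
            return command.call();
        } finally {
            effectiveUser = previous;
        }
    }

    // Stand-in for the KMS ACL check: only the real user may DECRYPT_EEK.
    static String decryptEEK() {
        if (!effectiveUser.equals("realuser")) {
            throw new SecurityException("User [" + effectiveUser
                + "] is not authorized to perform [DECRYPT_EEK]");
        }
        return "decrypted-key";
    }

    public static void main(String[] args) throws Exception {
        // OPEN-style direct call: executes as the httpfs service user, so
        // the ACL check fails -- mirroring the reported error message.
        try {
            decryptEEK();
        } catch (SecurityException e) {
            System.out.println("direct: " + e.getMessage());
        }
        // CREATE-style wrapped call: impersonates the real user and succeeds.
        System.out.println("wrapped: " + doAs("realuser", () -> decryptEEK()));
    }
}
```

In real HttpFS code the fix is simply to route OPEN through the same user-wrapping call that CREATE already uses, so every filesystem operation reaches the KMS under the end user's identity.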



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
