[ https://issues.apache.org/jira/browse/HDFS-14091?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16694127#comment-16694127 ]

Brahma Reddy Battula edited comment on HDFS-14091 at 11/21/18 3:27 AM:
-----------------------------------------------------------------------

[~RANith] thanks for reporting. I had planned to handle this under HDFS-13655, so let's do it there.

Also, this is not a blocker since the default value of "dfs.encrypt.data.transfer" is 
"false" (it is enabled only when data transfer encryption is required).
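Just to make the default explicit, a minimal sketch (assuming a standard Hadoop client classpath; the class name here is only for illustration and is not part of the patch):

{code:java}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hdfs.DFSConfigKeys;

// Minimal check of the property that gates this code path: the client asks the
// Router/NameNode for a DataEncryptionKey only when "dfs.encrypt.data.transfer"
// is set to true, and it defaults to false.
public class EncryptDataTransferCheck {  // hypothetical helper for illustration
  public static void main(String[] args) {
    Configuration conf = new Configuration();
    boolean encrypt = conf.getBoolean(
        DFSConfigKeys.DFS_ENCRYPT_DATA_TRANSFER_KEY,       // "dfs.encrypt.data.transfer"
        DFSConfigKeys.DFS_ENCRYPT_DATA_TRANSFER_DEFAULT);  // false
    System.out.println("Data transfer encryption enabled: " + encrypt);
  }
}
{code}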

Coming to the patch: the encryption key is based on the block pool ID (BPID), so I feel 
we need to get the keys from all the namespaces and return the one for the requested 
namespace, as in the sketch below.
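Something along these lines is what I mean (just a rough sketch, not the actual RBF code path; the map of per-namespace ClientProtocol proxies and the requestedBlockPoolId parameter are placeholders for whatever the Router already tracks):

{code:java}
import java.io.IOException;
import java.util.Map;

import org.apache.hadoop.hdfs.protocol.ClientProtocol;
import org.apache.hadoop.hdfs.security.token.block.DataEncryptionKey;

// Rough sketch of the suggestion above: ask every downstream namespace for its
// DataEncryptionKey and return the one whose block pool id matches the
// namespace the client is actually writing to.
public class DataEncryptionKeyLookup {  // hypothetical helper, not the patch itself

  public static DataEncryptionKey getKeyForBlockPool(
      Map<String, ClientProtocol> namenodeProxies,  // one proxy per namespace
      String requestedBlockPoolId) throws IOException {
    for (ClientProtocol proxy : namenodeProxies.values()) {
      // Each NameNode returns a key tagged with its own block pool id (BPID).
      DataEncryptionKey key = proxy.getDataEncryptionKey();
      if (key != null && key.blockPoolId.equals(requestedBlockPoolId)) {
        return key;
      }
    }
    return null; // no namespace matched; caller decides how to handle it
  }
}
{code}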


was (Author: brahmareddy):
[~RANith] thanks for reporting. I planned to do this under HDFS-13655.

and this is not a blocker since the default value of "dfs.encrypt.data.transfer" is 
"false" (it is enabled only when data transfer encryption is required).

Coming to the patch: the encryption key is based on the block pool ID (BPID), so I feel 
we need to get the keys from all the namespaces and return the one for the requested 
namespace.

> RBF: File Read and Writing is failing when security is enabled.
> ---------------------------------------------------------------
>
>                 Key: HDFS-14091
>                 URL: https://issues.apache.org/jira/browse/HDFS-14091
>             Project: Hadoop HDFS
>          Issue Type: Sub-task
>    Affects Versions: HDFS-13532
>            Reporter: Ranith Sardar
>            Assignee: Ranith Sardar
>            Priority: Blocker
>         Attachments: HDFS-14091.001.patch
>
>
> 2018-11-20 14:20:53,127 INFO hdfs.DataStreamer: Exception in createBlockOutputStream blk_1073741872_1048
> org.apache.hadoop.ipc.RemoteException(java.lang.UnsupportedOperationException): Operation "getDataEncryptionKey" is not supported
>  at org.apache.hadoop.hdfs.server.federation.router.RouterRpcServer.checkOperation(RouterRpcServer.java:436)
>  at org.apache.hadoop.hdfs.server.federation.router.RouterRpcServer.getDataEncryptionKey(RouterRpcServer.java:1965)
>  at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getDataEncryptionKey(ClientNamenodeProtocolServerSideTranslatorPB.java:1214)
>  at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>  at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
>  at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
>  at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:878)
>  at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:824)
>  at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:422)
>  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
>  at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2684)
>  at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1520)
>  at org.apache.hadoop.ipc.Client.call(Client.java:1466)
>  at org.apache.hadoop.ipc.Client.call(Client.java:1376)
>  at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
>  at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
>  at com.sun.proxy.$Proxy11.getDataEncryptionKey(Unknown Source)
>  at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getDataEncryptionKey(ClientNamenodeProtocolTranslatorPB.java:1133)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:497)
>  at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
>  at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
>  at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
>  at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
>  at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
>  at com.sun.proxy.$Proxy12.getDataEncryptionKey(Unknown Source)
>  at org.apache.hadoop.hdfs.DFSClient.newDataEncryptionKey(DFSClient.java:1824)
>  at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:214)
>  at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:183)
>  at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1795)
>  at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1743)
>  at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:718)


