[jira] [Commented] (HDFS-9878) Data transfer encryption with AES 192: Invalid key length.

2016-10-06 Thread ASF GitHub Bot (JIRA)

[ https://issues.apache.org/jira/browse/HDFS-9878?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15551661#comment-15551661 ]

ASF GitHub Bot commented on HDFS-9878:
--------------------------------------

GitHub user QwertyManiac opened a pull request:

https://github.com/apache/hadoop/pull/135

HDFS-9878. Add support for AES-192 in OpensslCipher.

- Adds equivalent support for the 192-bit AES/CTR/NoPadding codec in 
OpensslCipher (a minimal key/IV-validation sketch follows below)
- Adds test cases covering 192-bit (24-byte) and 256-bit (32-byte) keys in 
both the JCE and OpenSSL crypto tests
- Improves the error message when an invalid key or IV size is passed into 
OpensslCipher
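
As a rough illustration only (class and method names below are hypothetical, 
not the actual OpensslCipher/JNI code), the change amounts to accepting 
24-byte keys alongside the existing 16- and 32-byte ones for 
AES/CTR/NoPadding, with a more descriptive message on failure:

/**
 * Hypothetical sketch (not the actual Hadoop code) of the key/IV validation
 * this PR extends: AES/CTR/NoPadding with a 16-byte IV should accept
 * 16-, 24- and 32-byte keys (AES-128/192/256).
 */
public class AesCtrKeyCheckSketch {
  private static final int IV_LENGTH = 16;

  static void checkKeyAndIv(byte[] key, byte[] iv) {
    if (iv == null || iv.length != IV_LENGTH) {
      throw new IllegalArgumentException("Invalid IV length: expected "
          + IV_LENGTH + " bytes, got " + (iv == null ? "null" : iv.length));
    }
    if (key == null
        || (key.length != 16 && key.length != 24 && key.length != 32)) {
      throw new IllegalArgumentException("Invalid key length: expected 16, 24 "
          + "or 32 bytes, got " + (key == null ? "null" : key.length));
    }
  }

  public static void main(String[] args) {
    checkKeyAndIv(new byte[24], new byte[16]);  // 192-bit key, now accepted
    System.out.println("24-byte (192-bit) key accepted");
  }
}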

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/QwertyManiac/hadoop HDFS-9878

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/hadoop/pull/135.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #135


commit 4e08bafaec31f0190f731f6ca0e07c7f2125d489
Author: Harsh J 
Date:   2016-10-06T05:52:22Z

HDFS-9878. Add support for AES-192 in OpenSSL native Cipher.




> Data transfer encryption with AES 192: Invalid key length.
> -----------------------------------------------------------
>
> Key: HDFS-9878
> URL: https://issues.apache.org/jira/browse/HDFS-9878
> Project: Hadoop HDFS
>  Issue Type: Bug
>  Components: datanode, hdfs-client
>Affects Versions: 2.7.2
> Environment: OS: Ubuntu 14.04
> /hadoop-2.7.2/bin$ uname -a
> Linux wkstn-kpalaniappan 3.13.0-79-generic #123-Ubuntu SMP Fri Feb 19 
> 14:27:58 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux
> /hadoop-2.7.2/bin$ java -version
> java version "1.7.0_95"
> OpenJDK Runtime Environment (IcedTea 2.6.4) (7u95-2.6.4-0ubuntu0.14.04.1)
> OpenJDK 64-Bit Server VM (build 24.95-b01, mixed mode)
> Hadoop version: 2.7.2
>Reporter: Karthik Palaniappan
>
> Configuring AES 128 or AES 256 encryption 
> (dfs.encrypt.data.transfer.cipher.key.bitlength = [128, 256]) works perfectly 
> fine. Trying to use AES 192 generates this exception on the datanode:
> 16/02/29 17:34:10 ERROR datanode.DataNode: 
> wkstn-kpalaniappan:50010:DataXceiver error processing unknown operation  src: 
> /127.0.0.1:57237 dst: /127.0.0.1:50010
> java.lang.IllegalArgumentException: Invalid key length.
>   at org.apache.hadoop.crypto.OpensslCipher.init(Native Method)
>   at org.apache.hadoop.crypto.OpensslCipher.init(OpensslCipher.java:176)
>   at 
> org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec$OpensslAesCtrCipher.init(OpensslAesCtrCryptoCodec.java:116)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.updateDecryptor(CryptoInputStream.java:290)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.resetStreamOffset(CryptoInputStream.java:303)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:128)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:109)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:133)
>   at 
> org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.createStreamPair(DataTransferSaslUtil.java:345)
>   at 
> org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.doSaslHandshake(SaslDataTransferServer.java:396)
>   at 
> org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.getEncryptedStreams(SaslDataTransferServer.java:178)
>   at 
> org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.receive(SaslDataTransferServer.java:110)
>   at 
> org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:193)
>   at java.lang.Thread.run(Thread.java:745)
> And this exception on the client:
> /hadoop-2.7.2/bin$ ./hdfs dfs -copyFromLocal ~/.vimrc /vimrc
> 16/02/29 17:34:10 WARN hdfs.DFSClient: DataStreamer Exception
> java.lang.IllegalArgumentException: Invalid key length.
>   at org.apache.hadoop.crypto.OpensslCipher.init(Native Method)
>   at org.apache.hadoop.crypto.OpensslCipher.init(OpensslCipher.java:176)
>   at 
> org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec$OpensslAesCtrCipher.init(OpensslAesCtrCryptoCodec.java:116)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.updateDecryptor(CryptoInputStream.java:290)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.resetStreamOffset(CryptoInputStream.java:303)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:128)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:109)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:133)
>   at 
> org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.createStreamPair

[jira] [Commented] (HDFS-9878) Data transfer encryption with AES 192: Invalid key length.

2016-03-01 Thread Wei-Chiu Chuang (JIRA)

[ https://issues.apache.org/jira/browse/HDFS-9878?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15173661#comment-15173661 ]

Wei-Chiu Chuang commented on HDFS-9878:
---------------------------------------

Good catch! [~karth295], thanks for reporting this issue.
The native cipher code was implemented in HADOOP-10693, which has never 
supported 192-bit encryption through OpenSSL.
HDFS-6606 later added a property to make the key bit length configurable and 
documents 192 bit as supported, even though the HADOOP-10693 implementation 
does not support it. Maybe that claim was intended for the JCE codec?
https://issues.apache.org/jira/browse/HDFS-6606?focusedCommentId=14125921&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-14125921
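
For reference, a minimal sketch (illustration only, not part of any patch 
here) of setting that HDFS-6606 property through the Hadoop Configuration 
API, equivalent to an hdfs-site.xml entry; on 2.7.2 the value 192 below is 
exactly what triggers the reported "Invalid key length" error:

import org.apache.hadoop.conf.Configuration;

public class KeyBitlengthConfigSketch {
  public static void main(String[] args) {
    Configuration conf = new Configuration();
    // 128 and 256 work today; 192 fails in the native OpensslCipher.
    conf.setInt("dfs.encrypt.data.transfer.cipher.key.bitlength", 192);
    System.out.println(
        conf.get("dfs.encrypt.data.transfer.cipher.key.bitlength"));
  }
}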

However, it looks like OpenSSL itself can actually support 192-bit AES:
http://osxr.org:8080/openssl/source/crypto/evp/evp.h#0801
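
As a quick standalone check (a sketch for illustration, not Hadoop code), the 
JCE side already accepts a 24-byte AES key in CTR mode, subject to the JCE 
unlimited-strength policy files on older JDKs, so the limitation appears to 
be only in the native OpensslCipher wrapper:

import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class Aes192CtrProbe {
  public static void main(String[] args) throws Exception {
    byte[] key = new byte[24];  // 192-bit key (all zeros, illustration only)
    byte[] iv = new byte[16];   // 128-bit counter block
    Cipher cipher = Cipher.getInstance("AES/CTR/NoPadding");
    cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, "AES"),
        new IvParameterSpec(iv));
    byte[] out = cipher.doFinal("hello".getBytes("UTF-8"));
    System.out.println("AES-192/CTR encrypted " + out.length + " bytes");
  }
}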

[~hitliuyi] would you comment on this? :) Thanks a lot.

> Data transfer encryption with AES 192: Invalid key length.
> -----------------------------------------------------------
>
> Key: HDFS-9878
> URL: https://issues.apache.org/jira/browse/HDFS-9878
> Project: Hadoop HDFS
>  Issue Type: Bug
>  Components: datanode, hdfs-client
>Affects Versions: 2.7.2
> Environment: OS: Ubuntu 14.04
> /hadoop-2.7.2/bin$ uname -a
> Linux wkstn-kpalaniappan 3.13.0-79-generic #123-Ubuntu SMP Fri Feb 19 
> 14:27:58 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux
> /hadoop-2.7.2/bin$ java -version
> java version "1.7.0_95"
> OpenJDK Runtime Environment (IcedTea 2.6.4) (7u95-2.6.4-0ubuntu0.14.04.1)
> OpenJDK 64-Bit Server VM (build 24.95-b01, mixed mode)
> Hadoop version: 2.7.2
>Reporter: Karthik Palaniappan
>
> Configuring AES 128 or AES 256 encryption 
> (dfs.encrypt.data.transfer.cipher.key.bitlength = [128, 256]) works perfectly 
> fine. Trying to use AES 192 generates this exception on the datanode:
> 16/02/29 17:34:10 ERROR datanode.DataNode: 
> wkstn-kpalaniappan:50010:DataXceiver error processing unknown operation  src: 
> /127.0.0.1:57237 dst: /127.0.0.1:50010
> java.lang.IllegalArgumentException: Invalid key length.
>   at org.apache.hadoop.crypto.OpensslCipher.init(Native Method)
>   at org.apache.hadoop.crypto.OpensslCipher.init(OpensslCipher.java:176)
>   at 
> org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec$OpensslAesCtrCipher.init(OpensslAesCtrCryptoCodec.java:116)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.updateDecryptor(CryptoInputStream.java:290)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.resetStreamOffset(CryptoInputStream.java:303)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:128)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:109)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:133)
>   at 
> org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.createStreamPair(DataTransferSaslUtil.java:345)
>   at 
> org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.doSaslHandshake(SaslDataTransferServer.java:396)
>   at 
> org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.getEncryptedStreams(SaslDataTransferServer.java:178)
>   at 
> org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.receive(SaslDataTransferServer.java:110)
>   at 
> org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:193)
>   at java.lang.Thread.run(Thread.java:745)
> And this exception on the client:
> /hadoop-2.7.2/bin$ ./hdfs dfs -copyFromLocal ~/.vimrc /vimrc
> 16/02/29 17:34:10 WARN hdfs.DFSClient: DataStreamer Exception
> java.lang.IllegalArgumentException: Invalid key length.
>   at org.apache.hadoop.crypto.OpensslCipher.init(Native Method)
>   at org.apache.hadoop.crypto.OpensslCipher.init(OpensslCipher.java:176)
>   at 
> org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec$OpensslAesCtrCipher.init(OpensslAesCtrCryptoCodec.java:116)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.updateDecryptor(CryptoInputStream.java:290)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.resetStreamOffset(CryptoInputStream.java:303)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:128)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:109)
>   at 
> org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:133)
>   at 
> org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.createStreamPair(DataTransferSaslUtil.java:345)
>   at 
> org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.doSaslHandshake(SaslDataTransferClient.java:490)
>   at 
> org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.getEncryptedStreams(SaslData