Hi Zongtian,
This is definitely not a JDK issue. It is a wire-protocol compatibility
problem between the client and the server (DataNode).

bq. what the client mean, it mean the application running on hdfs, how does
it have a encryption?
I'm not quite sure what you're asking. HDFS supports several distinct
encryption features: at-rest encryption, data transfer encryption, RPC
encryption, and SSL encryption.
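
For reference, the last three map to the properties you mentioned. A minimal
sketch (assuming stock Hadoop 2.x property names; note that only
dfs.encrypt.data.transfer takes a boolean, while the other two take
enumerated values, so setting them to "false" is not a valid value):

  <!-- hdfs-site.xml: disable wire encryption for block data -->
  <property>
    <name>dfs.encrypt.data.transfer</name>
    <value>false</value>
  </property>

  <!-- hdfs-site.xml: serve the web UIs over plain HTTP -->
  <property>
    <name>dfs.http.policy</name>
    <value>HTTP_ONLY</value>
  </property>

  <!-- core-site.xml: authenticate only, no RPC integrity/privacy -->
  <property>
    <name>hadoop.rpc.protection</name>
    <value>authentication</value>
  </property>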

I'd recommend making sure your Hadoop client version matches the server
version. The log message suggests the DataNode is running Hadoop 2.7.0 or
later.
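
A quick way to compare is to run

  $ hadoop version

on both the client machine and the DataNode host, and check that the
reported versions (and builds) match.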

On Wed, Jun 27, 2018 at 2:24 AM ZongtianHou <zongtian...@icloud.com> wrote:

> Does anyone have some clue about it? I have updated the jdk, and still
> cannot solve the problem. Thx advance for any info!!
>
> On 27 Jun 2018, at 12:23 AM, ZongtianHou <zongtian...@icloud.com> wrote:
>
> This is the log info: org.apache.hadoop.hdfs.server.datanode.DataNode:
> Failed to read expected encryption handshake from client at /
> 127.0.0.1:53611. Perhaps the client is running an  older version of
> Hadoop which does not support encryption
>
> I have two more questions here.
> 1 what the client mean, it mean the application running on hdfs, how does
> it have a encryption?
> 2 I have turn off the encryption about data transfer, rpc protection, http
> protection by setting properties of  hadoop.rpc.protection, 
> dfs.encrypt.data.transfer
> and dfs.http.policy as false, why there is still encryption?
>
> Any clue will be appreciated.
>

-- 
A very happy Clouderan
