Unknowingly "dfs.encrypt.data.transfer" configured as true in datanode??
Please cross with check datanode configurations using the following way.
http://<DN_IP>:<httpport>/conf<http://%3cDN_IP%3e:%3chttpport%3e/conf>
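If it helps, here is a rough sketch of loading the DataNode's live /conf output into a Hadoop Configuration object and printing the effective value; the class name is just for illustration, and the DN host and HTTP port are passed as arguments:

    import java.net.URL;
    import org.apache.hadoop.conf.Configuration;

    // Load the DataNode's /conf servlet output and print the effective value
    // of dfs.encrypt.data.transfer as the DataNode itself sees it.
    public class CheckDnConf {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration(false);  // empty conf, no local defaults
            conf.addResource(new URL("http://" + args[0] + ":" + args[1] + "/conf"));
            System.out.println("dfs.encrypt.data.transfer = "
                    + conf.get("dfs.encrypt.data.transfer", "<not set>"));
        }
    }

Run it as: java CheckDnConf <DN_IP> <httpport> (with the Hadoop client jars on the classpath).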

bq.1 What does the client mean? Does it mean the application running on HDFS, and how does it have encryption?
Yes, the client is the application that connects to the DataNode during the handshake.
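Put differently, the client is any process that opens files through the HDFS FileSystem API; whether it attempts the encryption handshake depends on the dfs.encrypt.data.transfer value its own Configuration resolves from the *-site.xml files on its classpath. A minimal sketch (the path below is illustrative only):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Minimal HDFS client: the dfs.encrypt.data.transfer value resolved here
    // must be consistent with what the DataNode expects, otherwise the
    // DataNode logs the "Failed to read expected encryption handshake" error.
    public class MinimalHdfsClient {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();  // loads client-side core-site.xml/hdfs-site.xml
            System.out.println("client sees dfs.encrypt.data.transfer = "
                    + conf.getBoolean("dfs.encrypt.data.transfer", false));

            try (FileSystem fs = FileSystem.get(conf);
                 FSDataInputStream in = fs.open(new Path("/tmp/example.txt"))) {  // illustrative path
                System.out.println("first byte: " + in.read());
            }
        }
    }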

From: ZongtianHou [mailto:zongtian...@icloud.com]
Sent: 27 June 2018 14:54
To: user@hadoop.apache.org
Subject: Re: Security problem extra

Does anyone have a clue about this? I have updated the JDK and still cannot solve the problem. Thanks in advance for any info!
On 27 Jun 2018, at 12:23 AM, ZongtianHou <zongtian...@icloud.com> wrote:

This is the log message: org.apache.hadoop.hdfs.server.datanode.DataNode: Failed to read expected encryption handshake from client at /127.0.0.1:53611. Perhaps the client is running an older version of Hadoop which does not support encryption

I have two more questions here.
1 What does the client mean? Does it mean the application running on HDFS, and how does it have encryption?
2 I have turned off encryption for data transfer, RPC protection, and HTTP protection by setting the properties hadoop.rpc.protection, dfs.encrypt.data.transfer, and dfs.http.policy to false. Why is there still encryption?
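For reference, here is a rough sketch of dumping the values the client-side configuration actually resolves for these three properties, along with where each value came from (it just reads whatever *-site.xml files are on the classpath; the class name is illustrative):

    import java.util.Arrays;
    import org.apache.hadoop.conf.Configuration;

    // Print the effective client-side value of each property mentioned above,
    // together with the resource it was loaded from.
    public class DumpSecurityProps {
        public static void main(String[] args) {
            Configuration conf = new Configuration();  // picks up core-site.xml/hdfs-site.xml from the classpath
            for (String key : new String[] {
                    "hadoop.rpc.protection",
                    "dfs.encrypt.data.transfer",
                    "dfs.http.policy"}) {
                System.out.println(key + " = " + conf.get(key, "<not set>")
                        + "  (source: " + Arrays.toString(conf.getPropertySources(key)) + ")");
            }
        }
    }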

Any clue will be appreciated.
