Does anyone know about this issue?
> On Nov 3, 2020, at 2:28 PM, ZongtianHou wrote:
Hi, everyone,
I am setting up a secure cluster in auto-HA mode. I got the following error
when I start the namenode; it seems the SSL connection to the journal node is
not configured correctly. I generated the keystore with keytool and set the
path and password of the truststore and keystore in ssl-server.xml and ssl-cli
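For reference, a minimal ssl-server.xml sketch for the HTTPS side of the NameNode/JournalNode; the paths, passwords, and filenames below are placeholders, not taken from this thread:

```xml
<configuration>
  <!-- Truststore used to verify peer certificates -->
  <property>
    <name>ssl.server.truststore.location</name>
    <value>/etc/hadoop/conf/truststore.jks</value>
  </property>
  <property>
    <name>ssl.server.truststore.password</name>
    <value>changeit</value>
  </property>
  <!-- Keystore holding this node's certificate and private key -->
  <property>
    <name>ssl.server.keystore.location</name>
    <value>/etc/hadoop/conf/keystore.jks</value>
  </property>
  <property>
    <name>ssl.server.keystore.password</name>
    <value>changeit</value>
  </property>
  <property>
    <name>ssl.server.keystore.keypassword</name>
    <value>changeit</value>
  </property>
</configuration>
```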
Figured it out!
> On 9 Nov 2018, at 10:08 PM, ZongtianHou wrote:
Hi, everyone,
I have searched for an answer for days and found nothing. When I format the
name node as my local user "kousouda", the filesystem owner, which should be
"kousouda", becomes the Kerberos principal of the namenode.
And the klist TGT is "kousouda"; I can't do anything in HDFS because there
is no
org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
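When file ownership shows up as a full Kerberos principal, the usual remedy is a hadoop.security.auth_to_local mapping in core-site.xml that shortens principals to local user names. A sketch, assuming illustrative principals and realm (none of these names come from the thread):

```xml
<property>
  <name>hadoop.security.auth_to_local</name>
  <value>
    RULE:[2:$1@$0](nn/.*@EXAMPLE\.COM)s/.*/hdfs/
    RULE:[2:$1@$0](dn/.*@EXAMPLE\.COM)s/.*/hdfs/
    DEFAULT
  </value>
</property>
```

DEFAULT strips the realm from simple principals such as kousouda@EXAMPLE.COM, so they map to the matching local user.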
> On 9 Nov 2018, at 4:09 PM, ZongtianHou wrote:
>
> And another weird thing: when I try mkdir in HDFS, it shows the following
> error:
> $hadoop dfs -mkdir /user
> DEPRECATED: Use of this script to execute hdfs command is deprecated.
>
="/":namenode:supergroup:drwxr-xr-x
Why is the owner of the root dir the namenode, rather than the user that
started the namenode?
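If the namenode's mapped user has ended up owning /, the HDFS superuser can create and re-own per-user directories. A sketch only; the user name and paths are illustrative, and this assumes you run it as whatever local user the NameNode principal maps to:

```shell
# Run as the HDFS superuser to give the local user a writable home directory.
hdfs dfs -mkdir -p /user/kousouda
hdfs dfs -chown kousouda:kousouda /user/kousouda
```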
> On 9 Nov 2018, at 3:45 PM, ZongtianHou wrote:
>
> I run all processes with the same user. And it should be the superuser since
> I start the service. Is the datanode a superuser?
>>
>> Regards
>> Harinder
>>
>> On Thu, Nov 8, 2018 at 11:29 PM ZongtianHou wrote:
Hi, everyone,
I set up Kerberos for the HDFS cluster, but after I start the name node and
then the datanode, the namenode log file displays the following error:
2018-11-09 15:09:38,725 WARN org.apache.hadoop.security.UserGroupInformation:
No groups available for user datanode
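The "No groups available" warning means the short name the DataNode principal maps to has no resolvable group on the NameNode host. One option is to create a matching OS user and group there; another is a static override in core-site.xml. A sketch, where the user and group names are assumptions:

```xml
<property>
  <name>hadoop.user.group.static.mapping.overrides</name>
  <value>datanode=supergroup</value>
</property>
```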
Hi, everyone:
I set up Kerberos for the Hadoop cluster. I can do hadoop dfs -ls /,
but when I run the command below, it says I don't have a TGT, even though I
do; I can see it with klist. Does anyone know how to fix this? This is the
command and the error info:
ssh -o StrictHostKeyChecking=no localhost "sudo -u
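A likely cause, offered as a guess: sudo -u starts a session that cannot see the invoking user's Kerberos credential cache, so the TGT that klist shows is not visible to the command. One workaround is to kinit from a keytab inside the remote command; the principal, keytab path, and hdfs user below are all placeholders:

```shell
# Obtain a fresh TGT inside the sudo session before running the HDFS command.
ssh -o StrictHostKeyChecking=no localhost \
  'sudo -u hdfs kinit -kt /etc/security/keytabs/hdfs.keytab hdfs@EXAMPLE.COM \
   && sudo -u hdfs hdfs dfs -ls /'
```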
> <http://mail-archives.apache.org/mod_mbox/hadoop-user/201604.mbox/%3cd32b25c1.e6fe%25...@exabeam.com%3E>
> dfs.block.access.token.enable to true
>
> On Tue, Jun 26, 2018 at 6:51 AM, ZongtianHou wrot
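The suggestion above refers to the block access token setting in hdfs-site.xml, which a Kerberized cluster needs so DataNodes can authorize block access:

```xml
<property>
  <name>dfs.block.access.token.enable</name>
  <value>true</value>
</property>
```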
cryption?
> Yes, the application (client) that is connected to the DN during the handshake
>
> From: ZongtianHou
> Sent: 27 June 2018 14:54
> To: user@hadoop.apache.org
> Subjec
Does anyone have a clue about this? I have updated the JDK and still cannot
solve the problem. Thanks in advance for any info!
> On 27 Jun 2018, at 12:23 AM, ZongtianHou wrote:
This is the log info: org.apache.hadoop.hdfs.server.datanode.DataNode: Failed
to read expected encryption handshake from client at /127.0.0.1:53611. Perhaps
the client is running an older version of Hadoop which does not support
encryption.
I have two more questions here.
1. What does "the client" mean
Hi, everyone:
I have set up Kerberos for Hadoop. The namenode can be accessed correctly, but
when I try to write data to a datanode, it gives the error info:
Failed to read expected encryption handshake from client at /127.0.0.1:59789.
Perhaps the client is running an older version of Hadoop w
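This handshake failure usually indicates that encrypted data transfer is enabled on the DataNode but the connecting client does not have the matching setting. A sketch of the hdfs-site.xml entries that must agree on both sides (the algorithm value is just an example):

```xml
<property>
  <name>dfs.encrypt.data.transfer</name>
  <value>true</value>
</property>
<property>
  <name>dfs.encrypt.data.transfer.algorithm</name>
  <value>3des</value>
</property>
```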
dfs.web.authentication.kerberos.principal
h...@cw.com
> On 21 Jun 2018, at 4:52 PM, ZongtianHou wrote:
Hi, everyone:
I am setting up Kerberos for a Hadoop cluster, but when starting the datanode,
the following error happened:
java.lang.RuntimeException: Cannot start secure DataNode without configuring
either privileged resources or SASL RPC data transfer protection and SSL for
HTTP. Using privilege
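The exception names two routes: run the DataNode on privileged ports (below 1024) started as root, or enable SASL data transfer protection together with HTTPS. The SASL route can be sketched in hdfs-site.xml roughly as follows (the protection level is one of authentication, integrity, or privacy, chosen here only as an example; the SSL keystores must be configured separately):

```xml
<property>
  <name>dfs.data.transfer.protection</name>
  <value>authentication</value>
</property>
<property>
  <name>dfs.http.policy</name>
  <value>HTTPS_ONLY</value>
</property>
```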