Hi, I am trying to configure a secure Hadoop pseudo-distributed cluster on Azure using Azure AD Domain Services.
OS - Windows Server 2012 R2 Datacenter
Hadoop version - 2.7.2

I can run *hadoop fs -ls /*, and the example MapReduce job works fine:

    yarn jar %HADOOP_HOME%\share\hadoop\mapreduce\hadoop-mapreduce-examples-*.jar pi 16 10000

But when I run *hdfs fsck /*, it fails with:

    Connecting to namenode via https://node1:50470/fsck?ugi=Kumar&path=%2F
    Exception in thread "main" java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos credentails)
        at org.apache.hadoop.hdfs.tools.DFSck.doWork(DFSck.java:335)
        at org.apache.hadoop.hdfs.tools.DFSck.access$000(DFSck.java:73)
        at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:152)
        at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:149)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.hdfs.tools.DFSck.run(DFSck.java:148)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at org.apache.hadoop.hdfs.tools.DFSck.main(DFSck.java:377)
    Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos credentails)
        at org.apache.hadoop.security.authentication.client.AuthenticatedURL.extractToken(AuthenticatedURL.java:274)
        at org.apache.hadoop.security.authentication.client.PseudoAuthenticator.authenticate(PseudoAuthenticator.java:77)
        at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:214)
        at org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:215)
        at org.apache.hadoop.hdfs.web.URLConnectionFactory.openConnection(URLConnectionFactory.java:161)
        at org.apache.hadoop.hdfs.tools.DFSck.doWork(DFSck.java:333)
        ... 10 more

When I access the namenode web UI, it shows:

    GSSException: Failure unspecified at GSS-API level (Mechanism level: Encryption type AES256 CTS mode with HMAC SHA1-96 is not supported/enabled)

Could someone help me resolve this error and get the secure cluster working?
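For what it's worth, the "AES256 CTS mode with HMAC SHA1-96 is not supported/enabled" message on the web UI often indicates the JRE is limited to 128-bit AES (on Oracle Java 7/8 this usually means the JCE Unlimited Strength policy files are not installed, so AES256 Kerberos tickets cannot be decrypted). A minimal check, assuming only a standard JDK (the class name AesCheck is mine, not from the post):

```java
import javax.crypto.Cipher;

public class AesCheck {
    public static void main(String[] args) throws Exception {
        // Maximum AES key length the installed crypto policy allows.
        int max = Cipher.getMaxAllowedKeyLength("AES");
        System.out.println("Max AES key length: " + max);
        // 128        -> restricted policy; AES256 Kerberos tickets will fail like above
        // 2147483647 -> unlimited strength; AES256 should work
    }
}
```

If this prints 128 on the namenode's JRE, that would match the GSSException, whereas Integer.MAX_VALUE would point the investigation elsewhere (e.g. the krb5.conf enctypes or the keytab).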