Hi All,

We are building a Spark Streaming application that writes data to an HBase table, but the writes/reads are failing with the following exception:
16/06/13 04:35:16 ERROR ipc.AbstractRpcClient: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
        at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
        at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:605)

The failure happens on the executor machines: the executors do not appear to receive the Kerberos credentials/token that the driver has. Can someone help me resolve this issue?

*Environment Details*
Spark Version  : 1.6.1
HBase Version  : 1.0.0
Hadoop Version : 2.6.0

--
Thanks & Regards
Kamesh.
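For context, we submit the job roughly as below (the principal, keytab path, jar name, and class name are placeholders, not our real values). My understanding is that on YARN the --principal/--keytab options are meant to let Spark obtain and renew delegation tokens for long-running streaming jobs; please correct me if this is not enough for HBase:

```shell
# Placeholder submit command for the streaming job (yarn-cluster mode).
# --principal/--keytab ask Spark to log in via Kerberos on our behalf
# so executors can receive delegation tokens; all names below are
# illustrative, not our actual principal/paths.
spark-submit \
  --master yarn-cluster \
  --principal user@EXAMPLE.COM \
  --keytab /etc/security/keytabs/user.keytab \
  --class com.example.StreamingToHBase \
  streaming-to-hbase.jar
```

Even with a submit command along these lines, the executors still hit the GSSException above when opening the HBase connection.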