I haven't looked in detail at your hbase-site.xml, but if you're running
Apache HBase (and not a CDH release), I'd recommend using the official
reference guide [1] to configure your cluster instead of the CDH 4.2.0 docs;
those correspond to HBase 0.94 and may well require different steps to set
up security. If you are trying out CDH HBase, be sure to use up-to-date
documentation for your release.
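For reference, the server-side Kerberos setup in the guide boils down to a
handful of properties in hbase-site.xml, something like the sketch below
(the realm EXAMPLE.COM and the keytab path are placeholders; substitute
your own):

```xml
<!-- hbase-site.xml: server-side Kerberos settings per the HBase 1.x
     reference guide. EXAMPLE.COM and the keytab path are placeholders. -->
<property>
  <name>hbase.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hbase.security.authorization</name>
  <value>true</value>
</property>
<property>
  <name>hbase.master.kerberos.principal</name>
  <value>hbase/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>hbase.master.keytab.file</name>
  <value>/etc/hbase/conf/hbase.keytab</value>
</property>
<property>
  <name>hbase.regionserver.kerberos.principal</name>
  <value>hbase/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>hbase.regionserver.keytab.file</name>
  <value>/etc/hbase/conf/hbase.keytab</value>
</property>
```

The "Failed to find any Kerberos tgt" in your trace usually means the
region server process couldn't log in from its keytab: a misspelled
principal, an unreadable keytab file, or a hostname that doesn't match the
_HOST substitution are the usual suspects, so I'd double-check those
properties first.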

Let us know how it goes.

[1] https://hbase.apache.org/book.html#hbase.secure.configuration

-Dima

On Thu, Jul 28, 2016 at 10:09 AM, Aneela Saleem <ane...@platalytics.com>
wrote:

> Hi Dima,
>
> I'm running HBase version 1.2.2.
>
> On Thu, Jul 28, 2016 at 8:35 PM, Dima Spivak <dspi...@cloudera.com> wrote:
>
> > Hi Aneela,
> >
> > What version of HBase are you running?
> >
> > -Dima
> >
> > On Thursday, July 28, 2016, Aneela Saleem <ane...@platalytics.com>
> wrote:
> >
> > > Hi,
> > >
> > > I have successfully configured Zookeeper with Kerberos authentication.
> > Now
> > > I'm facing an issue while configuring HBase with Kerberos
> > > authentication. I have followed this link:
> > > http://www.cloudera.com/documentation/archive/cdh/4-x/4-2-0/CDH4-Security-Guide/cdh4sg_topic_8_2.html
> > > Attached are the configuration files, i.e., hbase-site.xml and
> > > zk-jaas.conf.
> > >
> > > Following are the logs from regionserver:
> > >
> > > 2016-07-28 17:44:56,881 WARN  [regionserver/hadoop-master/192.168.23.206:16020] regionserver.HRegionServer: error telling master we are up
> > > com.google.protobuf.ServiceException: java.io.IOException: Could not set up IO Streams to hadoop-master/192.168.23.206:16000
> > >   at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:240)
> > >   at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
> > >   at org.apache.hadoop.hbase.protobuf.generated.RegionServerStatusProtos$RegionServerStatusService$BlockingStub.regionServerStartup(RegionServerStatusProtos.java:8982)
> > >   at org.apache.hadoop.hbase.regionserver.HRegionServer.reportForDuty(HRegionServer.java:2284)
> > >   at org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.java:906)
> > >   at java.lang.Thread.run(Thread.java:745)
> > > Caused by: java.io.IOException: Could not set up IO Streams to hadoop-master/192.168.23.206:16000
> > >   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:785)
> > >   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
> > >   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
> > >   at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1241)
> > >   at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227)
> > >   ... 5 more
> > > Caused by: java.lang.RuntimeException: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
> > >   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:685)
> > >   at java.security.AccessController.doPrivileged(Native Method)
> > >   at javax.security.auth.Subject.doAs(Subject.java:415)
> > >   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
> > >   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:643)
> > >   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:751)
> > >   ... 9 more
> > > Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> > >   at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
> > >   at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
> > >   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:617)
> > >   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$700(RpcClientImpl.java:162)
> > >   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:743)
> > >   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:740)
> > >   at java.security.AccessController.doPrivileged(Native Method)
> > >   at javax.security.auth.Subject.doAs(Subject.java:415)
> > >   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
> > >   at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:740)
> > >   ... 9 more
> > > Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
> > >   at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
> > >   at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
> > >   at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
> > >   at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
> > >   at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
> > >   at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
> > >   at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
> > >
> > > Please have a look; what's going wrong here?
> > >
> > > Thanks
> > >
> > >
> >
>
