Hi Dima,

I have now followed the official reference guide, but I still get the same
error. Attached is the hbase-site.xml file; please have a look. What's wrong
there?

On Thu, Jul 28, 2016 at 11:58 PM, Dima Spivak <dspi...@cloudera.com> wrote:

> I haven't looked in detail at your hbase-site.xml, but if you're running
> Apache HBase (and not a CDH release), I might recommend using the official
> reference guide [1] to configure your cluster instead of the CDH 4.2.0 docs
> since those would correspond to HBase 0.94, and might well have different
> steps required to set up security. If you are trying out CDH HBase, be sure
> to use up-to-date documentation for your release.
>
> Let us know how it goes.
>
> [1] https://hbase.apache.org/book.html#hbase.secure.configuration
>
> -Dima
>
> On Thu, Jul 28, 2016 at 10:09 AM, Aneela Saleem <ane...@platalytics.com>
> wrote:
>
> > Hi Dima,
> >
> > I'm running HBase version 1.2.2.
> >
> > On Thu, Jul 28, 2016 at 8:35 PM, Dima Spivak <dspi...@cloudera.com>
> wrote:
> >
> > > Hi Aneela,
> > >
> > > What version of HBase are you running?
> > >
> > > -Dima
> > >
> > > On Thursday, July 28, 2016, Aneela Saleem <ane...@platalytics.com>
> > wrote:
> > >
> > > > Hi,
> > > >
> > > > I have successfully configured ZooKeeper with Kerberos authentication.
> > > > Now I'm facing an issue while configuring HBase with Kerberos
> > > > authentication. I have followed this link
> > > > <http://www.cloudera.com/documentation/archive/cdh/4-x/4-2-0/CDH4-Security-Guide/cdh4sg_topic_8_2.html>.
> > > > Attached are the configuration files, i.e., hbase-site.xml and
> > > > zk-jaas.conf.
> > > >
> > > > Following are the logs from the region server:
> > > >
> > > > 2016-07-28 17:44:56,881 WARN  [regionserver/hadoop-master/192.168.23.206:16020] regionserver.HRegionServer: error telling master we are up
> > > > com.google.protobuf.ServiceException: java.io.IOException: Could not set up IO Streams to hadoop-master/192.168.23.206:16000
> > > >     at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:240)
> > > >     at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
> > > >     at org.apache.hadoop.hbase.protobuf.generated.RegionServerStatusProtos$RegionServerStatusService$BlockingStub.regionServerStartup(RegionServerStatusProtos.java:8982)
> > > >     at org.apache.hadoop.hbase.regionserver.HRegionServer.reportForDuty(HRegionServer.java:2284)
> > > >     at org.apache.hadoop.hbase.regionserver.HRegionServer.run(HRegionServer.java:906)
> > > >     at java.lang.Thread.run(Thread.java:745)
> > > > Caused by: java.io.IOException: Could not set up IO Streams to hadoop-master/192.168.23.206:16000
> > > >     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:785)
> > > >     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
> > > >     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
> > > >     at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1241)
> > > >     at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227)
> > > >     ... 5 more
> > > > Caused by: java.lang.RuntimeException: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
> > > >     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$1.run(RpcClientImpl.java:685)
> > > >     at java.security.AccessController.doPrivileged(Native Method)
> > > >     at javax.security.auth.Subject.doAs(Subject.java:415)
> > > >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
> > > >     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.handleSaslConnectionFailure(RpcClientImpl.java:643)
> > > >     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:751)
> > > >     ... 9 more
> > > > Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> > > >     at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
> > > >     at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179)
> > > >     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:617)
> > > >     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$700(RpcClientImpl.java:162)
> > > >     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:743)
> > > >     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:740)
> > > >     at java.security.AccessController.doPrivileged(Native Method)
> > > >     at javax.security.auth.Subject.doAs(Subject.java:415)
> > > >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
> > > >     at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:740)
> > > >     ... 9 more
> > > > Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
> > > >     at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
> > > >     at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
> > > >     at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
> > > >     at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
> > > >     at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
> > > >     at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
> > > >     at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
> > > >
> > > >
> > > > Please have a look; what's going wrong here?
> > > >
> > > > Thanks
> > > >
> > > >
> > >
> >
>
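For reference, a zk-jaas.conf in this kind of setup typically follows the
Krb5LoginModule pattern below. This is only a sketch, not the attached file
verbatim; the keytab path and principal here are assumptions based on the
hbase-site.xml that follows:

```
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  keyTab="/etc/hadoop/conf/hbase.keytab"
  useTicketCache=false
  principal="hbase/hadoop-master@platalyticsrealm";
};
```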
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
/**
 *
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements.  See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership.  The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License.  You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
-->
<configuration>
  <property>
    <name>hbase.rootDir</name>
    <value>hdfs://hadoop-master:8020/hbase</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.clientPort</name>
    <value>2181</value>
  </property>
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>hadoop-master</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.rpc.engine</name>
    <value>org.apache.hadoop.hbase.ipc.SecureRpcEngine</value>
  </property>
  <property>
    <name>hbase.security.authentication</name>
    <value>kerberos</value>
  </property>
  <property>
    <name>hbase.security.authorization</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.coprocessor.region.classes</name>
    <value>org.apache.hadoop.hbase.security.token.TokenProvider</value>
  </property>
  <property>
    <name>hbase.rpc.protection</name>
    <value>privacy</value>
  </property>
  <property>
    <name>hadoop.security.authorization</name>
    <value>true</value>
  </property>
  <property>
    <name>hadoop.proxyuser.hdfs.groups</name>
    <value>users</value>
  </property>
  <property>
    <name>hadoop.proxyuser.hdfs.hosts</name>
    <value>*</value>
  </property>
  <property>
    <name>hbase.rest.authentication.type</name>
    <value>kerberos</value>
  </property>
  <property>
    <name>hbase.rest.authentication.kerberos.principal</name>
    <value>HTTP/_HOST@platalyticsrealm</value>
  </property>
  <property>
    <name>hbase.rest.authentication.kerberos.keytab</name>
    <value>/etc/hadoop/conf/nn.keytab</value>
  </property>
  <property>
    <name>hbase.regionserver.kerberos.principal</name>
    <value>hbase/_HOST@platalyticsrealm</value>
  </property>
  <property>
    <name>hbase.regionserver.keytab.file</name>
    <value>/etc/hadoop/conf/hbase.keytab</value>
  </property>
  <property>
    <name>hbase.master.kerberos.principal</name>
    <value>hbase/_HOST@platalyticsrealm</value>
  </property>
  <property>
    <name>hbase.master.keytab.file</name>
    <value>/etc/hadoop/conf/hbase.keytab</value>
  </property>
</configuration>
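One way to sanity-check a file like the one above is to pull the Kerberos
principals out and eyeball them together. The short script below is
illustrative only; the embedded sample mirrors a subset of the file rather
than reading it from disk, so the script is self-contained. It also flags
realms that are not upper-case, since Kerberos realm names are conventionally
upper-case and a lower-case realm is worth double-checking when GSSAPI
reports "No valid credentials provided":

```python
import xml.etree.ElementTree as ET

# A subset of the hbase-site.xml above, inlined so the script is self-contained.
SAMPLE = """<configuration>
  <property>
    <name>hbase.master.kerberos.principal</name>
    <value>hbase/_HOST@platalyticsrealm</value>
  </property>
  <property>
    <name>hbase.regionserver.kerberos.principal</name>
    <value>hbase/_HOST@platalyticsrealm</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
</configuration>"""

def kerberos_principals(xml_text):
    """Return {name: value} for every *.kerberos.principal property."""
    root = ET.fromstring(xml_text)
    return {
        prop.findtext("name"): prop.findtext("value")
        for prop in root.iter("property")
        if (prop.findtext("name") or "").endswith("kerberos.principal")
    }

for name, value in sorted(kerberos_principals(SAMPLE).items()):
    realm = value.rsplit("@", 1)[-1]
    # Realm names are upper-case by convention; flag anything that is not.
    note = "" if realm.isupper() else "  <-- realm is not upper-case"
    print(f"{name} = {value}{note}")
```

The same helper can be pointed at the real file with
`kerberos_principals(open("hbase-site.xml").read())` if the path is known.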
