Which user did you run your program as?

Have you granted the proper permissions on the HBase side?
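
For example, if your program runs as a (hypothetical) user 'sparkuser', you
could grant it read access to the table from the hbase shell:

  grant 'sparkuser', 'R', 'spark_t01'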

You should also check the master log to see if there are any clues.
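
If Kerberos is indeed the problem, the client normally has to log in from a
keytab before it touches HBase. Below is a minimal sketch of a driver-side
login; the principal name and keytab path are made up, so substitute your own:

  import org.apache.hadoop.hbase.HBaseConfiguration
  import org.apache.hadoop.security.UserGroupInformation

  val conf = HBaseConfiguration.create()
  // Tell the Hadoop/HBase client code that Kerberos is in use.
  conf.set("hadoop.security.authentication", "kerberos")
  conf.set("hbase.security.authentication", "kerberos")
  UserGroupInformation.setConfiguration(conf)
  // Hypothetical principal and keytab -- replace with your own.
  UserGroupInformation.loginUserFromKeytab("sparkuser@DEV.HRB",
    "/etc/security/keytabs/sparkuser.keytab")

Note that this only authenticates the driver; depending on your deployment,
executors that scan the table may also need HBase delegation tokens.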

Cheers



> On May 19, 2015, at 2:41 AM, donhoff_h <165612...@qq.com> wrote:
> 
> Hi, experts.
> 
> I ran the "HBaseTest" program, which is an example from the Apache Spark 
> source code, to learn how to use Spark to access HBase, but I hit the 
> following exception:
> Exception in thread "main" 
> org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after 
> attempts=36, exceptions:
> Tue May 19 16:59:11 CST 2015, null, java.net.SocketTimeoutException: 
> callTimeout=60000, callDuration=68648: row 'spark_t01,,00000000000000' on 
> table 'hbase:meta' at region=hbase:meta,,1.1588230740, 
> hostname=bgdt01.dev.hrb,16020,1431412877700, seqNum=0
> 
> I also checked the RegionServer log on the host "bgdt01.dev.hrb" mentioned in 
> the above exception and found a few entries like the following one:
> 2015-05-19 16:59:11,143 DEBUG 
> [RpcServer.reader=2,bindAddress=bgdt01.dev.hrb,port=16020] ipc.RpcServer: 
> RpcServer.listener,port=16020: Caught exception while reading:Authentication 
> is required 
> 
> The above entry does not point to my program explicitly, but the timestamps 
> are very close. Since my HBase version is 1.0.0 and security is enabled, I 
> suspect the exception was caused by Kerberos authentication, but I am not 
> sure.
> 
> Does anybody know if my guess is right? And if it is, could anybody tell me 
> how to set up Kerberos authentication in a Spark program? I don't know how to 
> do it. I already checked the API docs but did not find anything useful. Many 
> thanks!
> 
> By the way, my Spark version is 1.3.0. The code of "HBaseTest" is pasted 
> below:
> ***************************Source Code******************************
> import org.apache.hadoop.hbase.{HBaseConfiguration, HTableDescriptor}
> import org.apache.hadoop.hbase.client.HBaseAdmin
> import org.apache.hadoop.hbase.mapreduce.TableInputFormat
> import org.apache.spark.{SparkConf, SparkContext}
> 
> object HBaseTest {
>   def main(args: Array[String]) {
>     val sparkConf = new SparkConf().setAppName("HBaseTest")
>     val sc = new SparkContext(sparkConf)
>     val conf = HBaseConfiguration.create()
>     conf.set(TableInputFormat.INPUT_TABLE, args(0))
> 
>     // Initialize the HBase table if necessary
>     val admin = new HBaseAdmin(conf)
>     if (!admin.isTableAvailable(args(0))) {
>       val tableDesc = new HTableDescriptor(args(0))
>       admin.createTable(tableDesc)
>     }
> 
>     val hBaseRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat],
>       classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
>       classOf[org.apache.hadoop.hbase.client.Result])
> 
>     hBaseRDD.count()
> 
>     sc.stop()
>   }
> }
> 
