I have a similar problem: I can no longer pass the HBase configuration
directory as an extra classpath to Spark using
spark.executor.extraClassPath=MY_HBASE_CONF_DIR in Spark 1.3. We used to
run this in 1.2 without any problem.
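In case it's useful, the workaround I've been trying is to ship hbase-site.xml to the executors with --files and set the classpath explicitly on both driver and executors. This is only a sketch: the class name, jar name, and MY_HBASE_CONF_DIR are placeholders, and I haven't confirmed the behavior on 1.3.

```shell
# Hypothetical spark-submit invocation; the jar, class, and conf-dir names
# are placeholders. --files ships hbase-site.xml to each executor's working
# directory; the extraClassPath settings put the conf dir on both JVMs.
spark-submit \
  --class HBaseTest \
  --files MY_HBASE_CONF_DIR/hbase-site.xml \
  --conf spark.driver.extraClassPath=MY_HBASE_CONF_DIR \
  --conf spark.executor.extraClassPath=MY_HBASE_CONF_DIR \
  my-hbase-test.jar spark_t01
```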

On Tuesday, May 19, 2015, donhoff_h <165612...@qq.com> wrote:

>
> Sorry, this ref does not help me. I have set up the configuration in
> hbase-site.xml, but it seems there are still some extra configurations to
> be set, or APIs to be called, for my Spark program to pass authentication
> with HBase.
>
> Does anybody know how to set up authentication to a secured HBase in a
> Spark program that uses the "newAPIHadoopRDD" API to read from HBase?
>
> Many Thanks!
>
> ------------------ Original Message ------------------
> *From:* "yuzhihong" <yuzhih...@gmail.com>
> *Sent:* Tuesday, May 19, 2015, 9:54 PM
> *To:* "donhoff_h" <165612...@qq.com>
> *Cc:* "user" <user@spark.apache.org>
> *Subject:* Re: How to use spark to access HBase with Security enabled
>
> Please take a look at:
>
> http://hbase.apache.org/book.html#_client_side_configuration_for_secure_operation
>
> Cheers
>
> On Tue, May 19, 2015 at 5:23 AM, donhoff_h <165612...@qq.com> wrote:
>
>>
>> The principal is sp...@bgdt.dev.hrb. It is the user I used to run my
>> Spark programs. I am sure I have run the kinit command to make it take
>> effect, and I also used the HBase shell to verify that this user has the
>> right to scan and put to the tables in HBase.
>>
>> I still have no idea how to solve this problem. Can anybody help me
>> figure it out? Many thanks!
>>
>> ------------------ Original Message ------------------
>> *From:* "yuzhihong" <yuzhih...@gmail.com>
>> *Sent:* Tuesday, May 19, 2015, 7:55 PM
>> *To:* "donhoff_h" <165612...@qq.com>
>> *Cc:* "user" <user@spark.apache.org>
>> *Subject:* Re: How to use spark to access HBase with Security enabled
>>
>> Which user did you run your program as?
>>
>> Have you granted the proper permissions on the HBase side?
>>
>> You should also check the master log for clues.
>>
>> Cheers
>>
>>
>>
>> On May 19, 2015, at 2:41 AM, donhoff_h <165612...@qq.com> wrote:
>>
>> Hi, experts.
>>
>> I ran the "HBaseTest" program, an example from the Apache Spark source
>> code, to learn how to use Spark to access HBase, but I hit the following
>> exception:
>> Exception in thread "main"
>> org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after
>> attempts=36, exceptions:
>> Tue May 19 16:59:11 CST 2015, null, java.net.SocketTimeoutException:
>> callTimeout=60000, callDuration=68648: row 'spark_t01,,00000000000000' on
>> table 'hbase:meta' at region=hbase:meta,,1.1588230740,
>> hostname=bgdt01.dev.hrb,16020,1431412877700, seqNum=0
>>
>> I also checked the RegionServer log on the host "bgdt01.dev.hrb" listed
>> in the above exception and found a few entries like the following one:
>> 2015-05-19 16:59:11,143 DEBUG
>> [RpcServer.reader=2,bindAddress=bgdt01.dev.hrb,port=16020] ipc.RpcServer:
>> RpcServer.listener,port=16020: Caught exception while
>> reading:Authentication is required
>>
>> The above entry does not point to my program explicitly, but the
>> timestamps are very close. Since my HBase version is 1.0.0 and security is
>> enabled, I suspect the exception was caused by Kerberos authentication,
>> but I am not sure.
>>
>> Does anybody know if my guess is right? And if so, could anybody tell me
>> how to set up Kerberos authentication in a Spark program? I don't know how
>> to do it. I already checked the API docs but did not find anything
>> useful. Many thanks!
>>
>> By the way, my Spark version is 1.3.0. I also paste the code of
>> "HBaseTest" below:
>> ***************************Source Code******************************
>> import org.apache.hadoop.hbase.{HBaseConfiguration, HTableDescriptor}
>> import org.apache.hadoop.hbase.client.HBaseAdmin
>> import org.apache.hadoop.hbase.mapreduce.TableInputFormat
>> import org.apache.spark.{SparkConf, SparkContext}
>>
>> object HBaseTest {
>>   def main(args: Array[String]) {
>>     val sparkConf = new SparkConf().setAppName("HBaseTest")
>>     val sc = new SparkContext(sparkConf)
>>     val conf = HBaseConfiguration.create()
>>     conf.set(TableInputFormat.INPUT_TABLE, args(0))
>>
>>     // Initialize the HBase table if necessary
>>     val admin = new HBaseAdmin(conf)
>>     if (!admin.isTableAvailable(args(0))) {
>>       val tableDesc = new HTableDescriptor(args(0))
>>       admin.createTable(tableDesc)
>>     }
>>
>>     val hBaseRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat],
>>       classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
>>       classOf[org.apache.hadoop.hbase.client.Result])
>>
>>     hBaseRDD.count()
>>
>>     sc.stop()
>>   }
>> }
>>
>>
>
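One thing a driver-side kinit does not cover is the executors: they hold no Kerberos ticket of their own, so reads through newAPIHadoopRDD can still fail with "Authentication is required" on the region servers. A sketch of obtaining an HBase delegation token in the driver before creating the RDD follows; the API names are from HBase 0.98/1.0's TokenUtil, so please verify them against your exact version, and note that shipping the token to executors this way relies on the cluster manager (e.g. YARN) propagating the launching user's credentials.

```scala
// Sketch: fetch an HBase delegation token in the driver so executors can
// authenticate without their own Kerberos tickets. TokenUtil.obtainToken
// is from HBase 0.98/1.0; it is deprecated/replaced in later versions.
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.security.token.TokenUtil
import org.apache.hadoop.security.UserGroupInformation

val hbaseConf = HBaseConfiguration.create()
val user = UserGroupInformation.getCurrentUser
// Ask the HBase cluster for a delegation token and attach it to the
// current user's credentials before the RDD is created.
val token = TokenUtil.obtainToken(hbaseConf)
user.addToken(token)
```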

-- 
Many thanks.


Bill
