Hi,
The exception is the same as before, as shown below:
2015-05-23 18:01:40,943 ERROR [hconnection-0x14027b82-shared--pool1-t1]
ipc.AbstractRpcClient: SASL authentication failed. The most likely cause is
missing or invalid credentials. Consider 'kinit'.
Can you post your modified code?
Thanks
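As a client-side sanity check for a SASL failure like the one above, it can help to confirm there is a valid Kerberos ticket before launching the job. A minimal sketch (the keytab path and principal below are placeholders, not the poster's actual values):

```shell
# Obtain a fresh ticket from a keytab; path and principal are placeholders.
kinit -kt /etc/security/keytabs/spark.keytab spark@EXAMPLE.COM
# Verify the ticket and its expiry time.
klist
```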
On May 21, 2015, at 11:11 PM, donhoff_h 165612...@qq.com wrote:
Hi,
Thanks very much for the reply. I have tried the SecurityUtil. I can see
from the log that this statement executed successfully, but I still cannot pass
the authentication of HBase.
Hi,
My modified code is listed below; I just added the SecurityUtil API. I don't know
which property keys I should use, so I made two of my own property keys to locate
the keytab and principal.
object TestHBaseRead2 {
  def main(args: Array[String]) {
    val conf = new SparkConf()
    val sc = new SparkContext(conf)
Hi,
Thanks very much for the reply. I have tried the SecurityUtil. I can see
from the log that this statement executed successfully, but I still cannot pass
the authentication of HBase. And with more experiments, I found a new
interesting scenario. If I run the program in yarn-client mode, the
Can you share the exception(s) you encountered?
Thanks
On May 22, 2015, at 12:33 AM, donhoff_h 165612...@qq.com wrote:
Hi,
My modified code is listed below; I just added the SecurityUtil API. I don't
know which property keys I should use, so I made two of my own property keys to find
the keytab and principal.
You might also enable Kerberos debug output in hadoop-env.sh:
# Extra Java runtime options. Empty by default.
export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true ${HADOOP_OPTS}"
and check that the principals are the same on the NameNode and DataNode.
and you can
Hi,
Many thanks for the help. My Spark version is also 1.3.0, and I run it on YARN.
Following your advice, I have changed the configuration. Now my program can
read hbase-site.xml correctly, and it can also authenticate with ZooKeeper
successfully.
But I have met a new problem, which is that my
Are the worker nodes colocated with HBase region servers?
Were you running as the hbase superuser?
You may need to log in, using code similar to the following:
if (isSecurityEnabled()) {
  SecurityUtil.login(conf, fileConfKey, principalConfKey, localhost);
}
SecurityUtil is
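A minimal sketch of that login call with self-chosen property keys, in the spirit of the approach described earlier in the thread (the property key names, keytab path, and principal here are assumptions for illustration, not the poster's actual values):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.{SecurityUtil, UserGroupInformation}

val conf = new Configuration()
// Hypothetical property keys; any names work as long as the same keys
// are passed to SecurityUtil.login below.
conf.set("my.keytab.file", "/etc/security/keytabs/spark.keytab")
conf.set("my.kerberos.principal", "spark/_HOST@EXAMPLE.COM")

if (UserGroupInformation.isSecurityEnabled) {
  // Reads the keytab path and principal pattern from the Configuration,
  // substitutes _HOST with the local hostname, and logs in from the keytab.
  SecurityUtil.login(conf, "my.keytab.file", "my.kerberos.principal")
}
```

Note this only logs in the driver-side process; in yarn-cluster mode the executors do not inherit the driver's ticket cache, which may explain the difference between modes observed earlier in the thread.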
What I found with CDH 5.4.1 and Spark 1.3 is that the
spark.executor.extraClassPath setting is not working. I had to use
SPARK_CLASSPATH instead.
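A sketch of that workaround (the conf directory, class name, and jar name are placeholders):

```shell
# Workaround for Spark 1.3 on CDH 5.4.1: put the HBase conf dir on the
# executor classpath via the deprecated SPARK_CLASSPATH variable.
export SPARK_CLASSPATH=/etc/hbase/conf
spark-submit --master yarn-client --class TestHBaseRead2 my-app.jar
```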
On Thursday, May 21, 2015, Ted Yu yuzhih...@gmail.com wrote:
Are the worker nodes colocated with HBase region servers?
Were you running as the hbase superuser?
I have a similar problem: I can no longer pass the HBase configuration directory
as an extra classpath to Spark using
spark.executor.extraClassPath=MY_HBASE_CONF_DIR in Spark 1.3. We used
to run this in 1.2 without any problem.
On Tuesday, May 19, 2015, donhoff_h 165612...@qq.com wrote:
Sorry,
Hi, experts.
I ran the HBaseTest program, which is an example from the Apache Spark source
code, to learn how to use Spark to access HBase. But I met the following
exception:
Exception in thread "main"
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after
attempts=36, exceptions:
Which user did you run your program as?
Have you granted proper permissions on the HBase side?
You should also check the master log to see if there is some clue.
Cheers
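If permissions turn out to be the issue, granting them is done from the HBase shell; the user and table names below are placeholders:

```shell
# In the HBase shell: give user 'spark' read/write/create on 'TestTable'.
echo "grant 'spark', 'RWC', 'TestTable'" | hbase shell
```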
On May 19, 2015, at 2:41 AM, donhoff_h 165612...@qq.com wrote:
Hi, experts.
I ran the HBaseTest program which is an
Sorry, this reference does not help me. I have set up the configuration in
hbase-site.xml, but it seems there are still some extra configurations to be
set, or APIs to be called, for my Spark program to be able to pass
authentication with HBase.
Does anybody know how to set authentication to
The principal is sp...@bgdt.dev.hrb. It is the user that I use to run my Spark
programs. I am sure I have run the kinit command to make it take effect. And I
also used the HBase shell to verify that this user has the right to scan and
put to the tables in HBase.
Now I still have no idea how to
Please take a look at:
http://hbase.apache.org/book.html#_client_side_configuration_for_secure_operation
Cheers
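For reference, the client-side settings that section of the book calls for look roughly like this in hbase-site.xml (the realm is a placeholder; the principal patterns must match what the servers were started with):

```xml
<property>
  <name>hbase.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hbase.master.kerberos.principal</name>
  <value>hbase/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>hbase.regionserver.kerberos.principal</name>
  <value>hbase/_HOST@EXAMPLE.COM</value>
</property>
```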
On Tue, May 19, 2015 at 5:23 AM, donhoff_h 165612...@qq.com wrote:
The principal is sp...@bgdt.dev.hrb. It is the user that I used to run my
spark programs. I am sure I have run