You might also enable Kerberos debug output in hadoop-env.sh:

# Extra Java runtime options. Empty by default.
export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -Dsun.security.krb5.debug=true ${HADOOP_OPTS}"
and check that the principals are the same on the NameNode and the DataNodes.
Are the worker nodes colocated with HBase region servers?
Were you running as the hbase superuser?
You may need to log in explicitly, using code similar to the following:

if (UserGroupInformation.isSecurityEnabled()) {
  SecurityUtil.login(conf, fileConfKey, principalConfKey, localhost);
}

SecurityUtil is in the org.apache.hadoop.security package.
What I found with CDH 5.4.1 and Spark 1.3 is that the
spark.executor.extraClassPath setting is not working. I had to use
SPARK_CLASSPATH instead.
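As a rough sketch of the workaround described above — assuming the HBase client configuration lives at /etc/hbase/conf, a hypothetical path you would adjust for your cluster — SPARK_CLASSPATH can be exported before submitting the job:

```shell
# Hypothetical path: point SPARK_CLASSPATH at the directory containing
# hbase-site.xml so executors pick up the HBase client configuration.
# Note: SPARK_CLASSPATH is deprecated in later Spark releases.
export SPARK_CLASSPATH=/etc/hbase/conf
```

After exporting, submit the application as usual with spark-submit; the directory is prepended to the executor classpath.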
On Thursday, May 21, 2015, Ted Yu yuzhih...@gmail.com wrote:
Are the worker nodes colocated with HBase region servers?
Were you running as the hbase superuser?
I have a similar problem: I can no longer pass the HBase configuration
directory to Spark as an extra classpath entry using
spark.executor.extraClassPath=MY_HBASE_CONF_DIR in Spark 1.3. We used
to run this in 1.2 without any problem.
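For reference, the 1.2-style invocation being described might look like the following sketch. The conf directory path, class name, and jar name are hypothetical placeholders standing in for MY_HBASE_CONF_DIR and the actual application:

```shell
# Hypothetical job and paths; MY_HBASE_CONF_DIR is assumed to be /etc/hbase/conf here.
spark-submit \
  --conf spark.executor.extraClassPath=/etc/hbase/conf \
  --conf spark.driver.extraClassPath=/etc/hbase/conf \
  --class com.example.MyHBaseJob \
  my-hbase-job.jar
```

Both the driver and executor settings are shown because the HBase client configuration is typically needed on both sides.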
On Tuesday, May 19, 2015, donhoff_h 165612...@qq.com wrote:
Sorry,
Which user did you run your program as?
Have you granted the proper permissions on the hbase side?
You should also check the HBase master log for clues.
Cheers
On May 19, 2015, at 2:41 AM, donhoff_h 165612...@qq.com wrote:
Hi, experts.
I ran the HBaseTest program, which is an example included with Spark.
Please take a look at:
http://hbase.apache.org/book.html#_client_side_configuration_for_secure_operation
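The linked section covers client-side configuration for secure HBase. A minimal hbase-site.xml sketch along those lines might look like the following; the principal values are illustrative placeholders and must match the principals actually used by your cluster:

```xml
<configuration>
  <!-- Enable Kerberos authentication on the client. -->
  <property>
    <name>hbase.security.authentication</name>
    <value>kerberos</value>
  </property>
  <!-- Illustrative principals; replace EXAMPLE.COM with your realm. -->
  <property>
    <name>hbase.master.kerberos.principal</name>
    <value>hbase/_HOST@EXAMPLE.COM</value>
  </property>
  <property>
    <name>hbase.regionserver.kerberos.principal</name>
    <value>hbase/_HOST@EXAMPLE.COM</value>
  </property>
</configuration>
```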
Cheers
On Tue, May 19, 2015 at 5:23 AM, donhoff_h 165612...@qq.com wrote:
The principal is sp...@bgdt.dev.hrb. It is the user that I used to run my
spark programs. I am sure I have run