Cool, thanks for the info, JM. Thinking out loud...
* Could be a missing or inaccurate /etc/krb5.conf on the nodes running the
Spark tasks
* Could try setting the Java system property
sun.security.krb5.debug=true in the Spark executors
* Could try setting the log4j level for org.apache.hadoop.security to DEBUG
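For concreteness, the last two could look something like the following (the
spark-submit flags and log4j line are a sketch -- adjust to however you ship
your job and logging config):

```
# Turn on JDK Kerberos tracing in the executor (and driver) JVMs:
spark-submit \
  --conf "spark.executor.extraJavaOptions=-Dsun.security.krb5.debug=true" \
  --conf "spark.driver.extraJavaOptions=-Dsun.security.krb5.debug=true" \
  ...

# And in the log4j.properties the executors pick up:
log4j.logger.org.apache.hadoop.security=DEBUG
```

The krb5 debug output lands on stdout of each executor, so check the
per-executor logs in the Spark UI rather than the driver console.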
Hard to guess at the real issue without knowing more :). If there's any more
context you can share, I'd be happy to try to help.
(ps. obligatory warning about PHOENIX-3189 if you're using 4.8.0)
Jean-Marc Spaggiari wrote:
Using the keytab in the JDBC URL. That's the way we use it locally, and we
also tried running command-line applications directly from the worker
nodes, and it works. But inside the Spark executor it doesn't...
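For context, the URL follows Phoenix's principal+keytab form (the quorum
host, realm, and paths below are placeholders, not our real values):

```
jdbc:phoenix:zk1.example.com:2181:/hbase:myuser@EXAMPLE.COM:/etc/security/keytabs/myuser.keytab
```

That keytab path has to resolve on whichever JVM actually opens the
connection -- in this case, each Spark executor.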
2016-09-15 13:07 GMT-04:00 Josh Elser <josh.el...@gmail.com>:
How do you expect Kerberos authentication for JDBC on Spark to work? Are
you using the principal+keytab options in the Phoenix JDBC URL, or is
Spark itself obtaining a ticket for you (via some "magic")?
Jean-Marc Spaggiari wrote:
Hi,
I tried to build a small app all under Kerberos.
JDBC to Phoenix works
Client to HBase works
Client (puts) on Spark to HBase works.
But JDBC on Spark to HBase fails with a message like
"GSSException: No valid credentials provided (Mechanism level:
Failed to find any Kerberos tgt)]"
Keytab is accessible on all the nodes.
Keytab belongs to the user running the job, and executors are
running
under that user name. So this is fine.
Any idea of what this might be?
Thanks,
JMS