1. Regarding Kerberos ticket expiration:
* Usually you don't need to worry about it: you can use the local keytab on
every node in the Hadoop cluster.
* If the keytab is not available on the nodes of your Hadoop cluster, you will
need to update the keytab on every executor periodically.
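A minimal sketch of what the keytab-based approach above can look like inside an executor. This is an assumption-laden illustration, not a vetted recipe: the principal `hbase-user@EXAMPLE.COM`, the keytab path, and the table name `my_table` are placeholders, and the write logic is elided.

```scala
// Sketch: log in from a node-local keytab on each executor and open an
// HBase connection per partition. Requires hadoop-common and hbase-client
// on the classpath; names below are hypothetical.
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.ConnectionFactory
import org.apache.hadoop.security.UserGroupInformation

object HBaseSink {
  // One login per executor JVM; lazy so it runs on the executor, not the driver.
  @transient lazy val ugi: UserGroupInformation = {
    UserGroupInformation.setConfiguration(HBaseConfiguration.create())
    UserGroupInformation.loginUserFromKeytabAndReturnUGI(
      "hbase-user@EXAMPLE.COM",             // assumed principal
      "/etc/security/keytabs/user.keytab")  // local keytab present on every node
  }

  // Intended for use from foreachRDD { rdd => rdd.foreachPartition(writePartition) }
  def writePartition(rows: Iterator[(Array[Byte], Array[Byte])]): Unit = {
    // Re-login if the TGT is close to expiry; a no-op otherwise.
    ugi.checkTGTAndReloginFromKeytab()
    ugi.doAs(new java.security.PrivilegedExceptionAction[Unit] {
      override def run(): Unit = {
        val connection = ConnectionFactory.createConnection(HBaseConfiguration.create())
        try {
          val table = connection.getTable(TableName.valueOf("my_table")) // assumed table
          // ... issue Put/Get calls for the rows in this partition ...
          table.close()
        } finally connection.close() // always close so connections don't leak
      }
    })
  }
}
```

The key design points: the login is per-JVM (lazy val), the connection is per-partition rather than per-record, and `checkTGTAndReloginFromKeytab()` handles the periodic renewal mentioned above.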
2.
Hi,
Are there any best practices for managing HBase connections with
Kerberos authentication in a Spark Streaming (YARN) environment?
I want to know how executors manage HBase connections: how to create
them, close them, and refresh expired Kerberos tickets.
Thanks.