1. On the issue of Kerberos ticket expiry:
* Usually you don't need to worry about it; you can use the local keytab on every node in the Hadoop cluster.
* If the keytab is not present on the nodes of your Hadoop cluster, you will need to update the keytab on every executor periodically.
2. As for best practices for managing HBase connections with Kerberos authentication: the attached demo.java shows how to obtain an HBase connection.
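Since the attachment body isn't inline in the archive, here is a minimal sketch of what such a connection helper might look like. The class name, principal, keytab path, and ZooKeeper quorum are placeholders, and it assumes hbase-client and hadoop-common are on the classpath; this is an illustration of the pattern, not the poster's actual demo.java.

```java
import java.io.IOException;
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.security.UserGroupInformation;

// Hypothetical per-executor helper: one long-lived Connection, logged in
// from the node-local keytab, with periodic TGT relogin on access.
public class HBaseConnectionDemo {
    private static volatile Connection connection;
    private static UserGroupInformation ugi;

    public static synchronized Connection getConnection() throws Exception {
        if (connection == null || connection.isClosed()) {
            Configuration conf = HBaseConfiguration.create();
            conf.set("hadoop.security.authentication", "kerberos");
            conf.set("hbase.security.authentication", "kerberos");
            conf.set("hbase.zookeeper.quorum", "zk1,zk2,zk3"); // placeholder
            UserGroupInformation.setConfiguration(conf);

            // Log in from the local keytab on this node (placeholder values).
            ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
                    "hbase-user@EXAMPLE.COM",
                    "/etc/security/keytabs/hbase.keytab");

            // Create the connection under the Kerberos identity.
            connection = ugi.doAs((PrivilegedExceptionAction<Connection>) () ->
                    ConnectionFactory.createConnection(conf));
        }
        // Re-login if the TGT is close to expiry; cheap no-op otherwise.
        ugi.checkTGTAndReloginFromKeytab();
        return connection;
    }

    public static synchronized void close() throws IOException {
        if (connection != null) {
            connection.close();
            connection = null;
        }
    }
}
```

The idea is to call getConnection() from each task on the executor (Connection is thread-safe and expensive to create, so it is shared), and close it only when the executor shuts down. The checkTGTAndReloginFromKeytab() call on each access is what handles Kerberos expiry when a keytab is available locally.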
From: big data <bigdatab...@outlook.com>
Date: Tuesday, November 24, 2020 at 1:58 PM
To: "user@spark.apache.org" <user@spark.apache.org>
Subject: how to manage HBase connections in Executors of Spark Streaming ?

Hi,

Are there any best practices for managing HBase connections with Kerberos authentication in a Spark Streaming (YARN) environment? I want to know how executors manage HBase connections: how to create them, close them, and refresh expired Kerberos tickets.

Thanks.
demo.java