Z1Wu opened a new issue, #7188: URL: https://github.com/apache/kyuubi/issues/7188
### Code of Conduct

- [x] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)

### Search before asking

- [x] I have searched in the [issues](https://github.com/apache/kyuubi/issues?q=is%3Aissue) and found no similar issues.

### Describe the feature

- Support starting SparkSQLEngine with an external delegation token file to reduce unnecessary delegation token creation.
- Add a `YarnRMDelegationTokenProvider` to obtain the YARN RM delegation token, which is required when submitting a Spark YARN application without a TGT or keytab.

### Motivation

Spark applications launched in proxy-user + cluster mode automatically request delegation tokens during `spark-submit`:

```scala
// org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend#start
override def start(): Unit = {
  if (UserGroupInformation.isSecurityEnabled()) {
    delegationTokenManager = createTokenManager()
    delegationTokenManager.foreach { dtm =>
      val ugi = UserGroupInformation.getCurrentUser()
      val tokens = if (dtm.renewalEnabled) {
        dtm.start()
      } else {
        val creds = ugi.getCredentials()
        // fetch all related delegation tokens
        dtm.obtainDelegationTokens(creds)
        if (creds.numberOfTokens() > 0 || creds.numberOfSecretKeys() > 0) {
          SparkHadoopUtil.get.serialize(creds)
        } else {
          null
        }
      }
      if (tokens != null) {
        updateDelegationTokens(tokens)
      }
    }
  }
}
```

However, when `SparkSQLEngine` starts, these tokens are overwritten (without being canceled) by the tokens that the Kyuubi Server passes in through the Spark conf, as the code below shows:

```scala
// org.apache.kyuubi.engine.spark.SparkSQLEngine#createSpark
def createSpark(): SparkSession = {
  val engineCredentials = kyuubiConf.getOption(KyuubiReservedKeys.KYUUBI_ENGINE_CREDENTIALS_KEY)
  kyuubiConf.unset(KyuubiReservedKeys.KYUUBI_ENGINE_CREDENTIALS_KEY)
  _sparkConf.set(s"spark.${KyuubiReservedKeys.KYUUBI_ENGINE_CREDENTIALS_KEY}", "")
  val session = SparkSession.builder.config(_sparkConf).getOrCreate
  engineCredentials.filter(_.nonEmpty).foreach { credentials =>
    // reset all delegation token
    SparkTBinaryFrontendService.renewDelegationToken(session.sparkContext, credentials)
  }
  ....
}
```

In our production environment, more than 100,000 SparkSQLEngine instances are launched through Kyuubi every day, which creates a large number of unnecessary delegation tokens in the shared delegation token storage.

<img width="1310" height="574" alt="Image" src="https://github.com/user-attachments/assets/b3e06dd9-dbef-465e-bafd-1d11326e56cc" />

### Describe the solution

Spark natively supports passing tokens through an external delegation token file, as documented in https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/security/README.md

We can submit the Spark application with an external delegation token file to avoid unnecessary token creation in `spark-submit` and let the Kyuubi Server manage all delegation tokens for SparkSQLEngine. Two hedged sketches of this approach are appended at the end of this issue.

### Additional context

_No response_

### Are you willing to submit PR?

- [x] Yes. I would be willing to submit a PR with guidance from the Kyuubi community to improve.
- [ ] No. I cannot submit a PR at this time.
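---

As a rough illustration of the first proposal, here is a minimal sketch of how the Kyuubi Server could pre-fetch tokens once and hand them to the engine through a token file, instead of letting `spark-submit` create fresh ones. The file path is hypothetical, and relying on Hadoop's `HADOOP_TOKEN_FILE_LOCATION` environment variable to make the launched process pick the file up is an assumption about the wiring, not the final design:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.hadoop.security.Credentials

object ExternalTokenFileSketch {
  // Writes already-obtained credentials in Hadoop's token-storage format.
  // A process started with HADOOP_TOKEN_FILE_LOCATION pointing at this file
  // loads these tokens via UserGroupInformation instead of fetching new ones.
  def writeTokenFile(creds: Credentials, hadoopConf: Configuration): Path = {
    val tokenFile = new Path("/path/to/kyuubi-engine.token") // hypothetical location
    creds.writeTokenStorageFile(tokenFile, hadoopConf)
    tokenFile
  }
}
```

The engine submission would then look something like `HADOOP_TOKEN_FILE_LOCATION=/path/to/kyuubi-engine.token spark-submit --proxy-user <user> --deploy-mode cluster ...`, so the tokens travel with the application and no new ones are minted per engine.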
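And a sketch of the proposed `YarnRMDelegationTokenProvider`, assuming Kyuubi's `HadoopDelegationTokenProvider` SPI (the exact trait methods and the renewer resolution below are assumptions; the YARN client calls are standard Hadoop APIs):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.io.Text
import org.apache.hadoop.security.{Credentials, UserGroupInformation}
import org.apache.hadoop.yarn.client.ClientRMProxy
import org.apache.hadoop.yarn.client.api.YarnClient
import org.apache.hadoop.yarn.conf.YarnConfiguration
import org.apache.hadoop.yarn.util.ConverterUtils

import org.apache.kyuubi.config.KyuubiConf
import org.apache.kyuubi.credentials.HadoopDelegationTokenProvider

class YarnRMDelegationTokenProvider extends HadoopDelegationTokenProvider {
  private var yarnConf: YarnConfiguration = _

  override def serviceName: String = "yarn"

  override def initialize(hadoopConf: Configuration, kyuubiConf: KyuubiConf): Unit = {
    yarnConf = new YarnConfiguration(hadoopConf)
  }

  override def delegationTokensRequired(hadoopConf: Configuration, kyuubiConf: KyuubiConf): Boolean =
    UserGroupInformation.isSecurityEnabled

  override def obtainDelegationTokens(owner: String, creds: Credentials): Unit = {
    val yarnClient = YarnClient.createYarnClient()
    try {
      yarnClient.init(yarnConf)
      yarnClient.start()
      // Assumed renewer: the RM principal; production code may need its short name.
      val renewer = yarnConf.get(YarnConfiguration.RM_PRINCIPAL)
      val protoToken = yarnClient.getRMDelegationToken(new Text(renewer))
      // Convert the YARN protobuf token into a Hadoop Token tagged with the RM service.
      val service = ClientRMProxy.getRMDelegationTokenService(yarnConf)
      val token = ConverterUtils.convertFromYarn(protoToken, service)
      creds.addToken(token.getService, token)
    } finally {
      yarnClient.stop()
    }
  }
}
```

The server would add the obtained RM token to the same credentials that go into the external token file, so a keytab-less `spark-submit` can still talk to the ResourceManager.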
