GitHub user shashihoushengqia created a discussion: Questions regarding the 
submission of jar files

I found that every time I submit a Spark on K8s task through Kyuubi, a file 
named "kyuubi-spark-sql-engine_2.12-1.10.3.jar" is uploaded to S3/HDFS. This 
occupies a large amount of space on S3/HDFS, and uploading it on every 
submission also slows down Spark's startup. To avoid this, I placed the jar in 
/opt/spark/jars/ in all the Spark containers, and also in /opt/kyuubi/jars in 
the Kyuubi container. However, Kyuubi still uploads the jar to S3/HDFS. How 
should this problem be solved?
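
One direction worth checking (my own suggestion, not stated in the post) is 
Kyuubi's `kyuubi.session.engine.spark.main.resource` setting combined with the 
`local://` URI scheme of Spark on K8s, which marks a resource as already 
present inside the container image so spark-submit does not upload it. A 
minimal sketch, assuming the jar path inside the image matches the one above:

```properties
# kyuubi-defaults.conf on the Kyuubi server (sketch, path is an assumption).
# The local:// scheme tells Spark on K8s the file already exists in the
# container, so nothing is staged to S3/HDFS at submit time.
kyuubi.session.engine.spark.main.resource=local:///opt/spark/jars/kyuubi-spark-sql-engine_2.12-1.10.3.jar
```

Note that the jar version baked into the image must match the Kyuubi server 
version, otherwise the engine may fail to start.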

GitHub link: https://github.com/apache/kyuubi/discussions/7337

----
This is an automatically sent email for [email protected].
To unsubscribe, please send an email to: 
[email protected]

