xieshuaihu commented on PR #46278:
URL: https://github.com/apache/spark/pull/46278#issuecomment-2084128139

   Let me clarify this question.
   
   In vanilla Spark, we can do this:
   
   ```scala
   // create the session; these configs can only be set once,
   // or set in spark-submit:
   //   --conf spark.scheduler.mode=FAIR \
   //   --conf spark.scheduler.allocation.file=file:///path/to/file
   SparkSession.builder
     .config("spark.scheduler.mode", "FAIR")
     .config("spark.scheduler.allocation.file", "file:///path/to/file")
     .getOrCreate()
   
   // in one thread, set its pool to "pool1"
   val spark = SparkSession.builder.getOrCreate()
   spark.sparkContext.setLocalProperty("spark.scheduler.pool", "pool1")
   
   // in another thread, set its pool to "pool2"
   val spark = SparkSession.builder.getOrCreate()
   spark.sparkContext.setLocalProperty("spark.scheduler.pool", "pool2")
   
   // Note: pool1 and pool2 must be defined in 'file:///path/to/file'
   ```
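
   For reference, an allocation file defining these pools could look like the sketch below. The pool names match the example above; the weight and minShare values are illustrative, and the element names follow the fair-scheduler XML format described in Spark's job-scheduling docs:

   ```xml
   <?xml version="1.0"?>
   <!-- illustrative fairscheduler.xml; values are examples, not recommendations -->
   <allocations>
     <pool name="pool1">
       <schedulingMode>FAIR</schedulingMode>
       <weight>1</weight>
       <minShare>2</minShare>
     </pool>
     <pool name="pool2">
       <schedulingMode>FIFO</schedulingMode>
       <weight>2</weight>
       <minShare>0</minShare>
     </pool>
   </allocations>
   ```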
   
   But Spark Connect currently does not support setting a job's scheduler pool; in other words, jobs submitted by a Connect client cannot change the SparkContext's local property "spark.scheduler.pool".
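
   To make the gap concrete, here is a sketch of the Connect-client side (the `sc://` server URL is a placeholder). The Connect client's `SparkSession` does not expose a `sparkContext`, so there is no equivalent of the `setLocalProperty` call from the vanilla example:

   ```scala
   // Spark Connect client session; the server address is illustrative
   val spark = SparkSession.builder.remote("sc://localhost:15002").getOrCreate()

   // No SparkContext is available on the Connect client, so the vanilla-Spark
   // approach has no counterpart here:
   // spark.sparkContext.setLocalProperty("spark.scheduler.pool", "pool1")  // not possible
   ```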


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

