Hi all,

Does Livy support concurrent Spark job submission on a single Spark session (application)?
I have set the following properties:

```
"spark.scheduler.mode": "FAIR",
"spark.scheduler.pool": "production",
"spark.scheduler.allocation.file": "/home/hadoop/fair_scheduler.xml"
```


If yes, what needs to be checked?
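For context, here is a minimal sketch of what I mean, using Livy's REST API (POST /sessions and POST /sessions/{id}/statements). The Livy URL, the statement bodies, and the session kind are just illustrative assumptions:

```python
# Sketch: create one interactive Livy session carrying the FAIR scheduler conf above,
# then submit two statements back to back without waiting for the first to finish.
# Assumes Livy is reachable at http://localhost:8998 (adjust as needed).
import time
import requests

LIVY = "http://localhost:8998"
HEADERS = {"Content-Type": "application/json"}

# Create a single session (one Spark application) with the scheduler properties.
session = requests.post(f"{LIVY}/sessions", json={
    "kind": "pyspark",
    "conf": {
        "spark.scheduler.mode": "FAIR",
        "spark.scheduler.pool": "production",
        "spark.scheduler.allocation.file": "/home/hadoop/fair_scheduler.xml",
    },
}, headers=HEADERS).json()
session_id = session["id"]

# Wait until the session is idle before submitting code.
while requests.get(f"{LIVY}/sessions/{session_id}", headers=HEADERS).json()["state"] != "idle":
    time.sleep(5)

# Submit two statements immediately after one another; the question is whether
# Livy will run them concurrently within this one Spark application.
stmt_url = f"{LIVY}/sessions/{session_id}/statements"
s1 = requests.post(stmt_url, json={"code": "spark.range(10**8).count()"}, headers=HEADERS).json()
s2 = requests.post(stmt_url, json={"code": "spark.range(10**8).selectExpr('sum(id)').collect()"}, headers=HEADERS).json()
print("submitted statements:", s1["id"], s2["id"])
```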
