Hi, I'd like a specific job to fail if another instance of it is already running on the cluster (Spark Standalone in my case). How can I achieve this?
Thank you.
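One possible approach is to check the Standalone master before submitting: its web UI (default port 8080) serves cluster status as JSON at `/json`, including an `activeapps` list of currently running applications. A minimal sketch, assuming the master URL and the application name (`etl-job` here is a placeholder) are known, and exiting non-zero when a duplicate is found:

```python
import json
import sys
import urllib.request


def has_running_instance(status: dict, app_name: str) -> bool:
    """Return True if an application with this name appears in the
    master's list of active applications."""
    return any(app.get("name") == app_name
               for app in status.get("activeapps", []))


def fail_if_already_running(master_url: str, app_name: str) -> None:
    # master_url is e.g. "http://spark-master:8080" (an assumption;
    # adjust host/port to your deployment).
    with urllib.request.urlopen(f"{master_url}/json") as resp:
        status = json.load(resp)
    if has_running_instance(status, app_name):
        # Non-zero exit makes the wrapper/submit script fail.
        sys.exit(f"Refusing to submit: '{app_name}' is already running")
```

Note this is advisory, not atomic: two submissions racing through the check at the same moment could both pass, so for strict mutual exclusion you would still want an external lock (e.g. ZooKeeper or a database row).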