hi

We have many users on our Spark-on-YARN cluster. Most of them forget to
stop their SparkContext when their analysis is done (Spark Thrift Server
sessions or PySpark Jupyter kernels), so the applications keep holding
cluster resources.

I wonder how to detect that there is no activity on a SparkContext so
that I can kill the application.
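One approach I have been sketching (not tested on our cluster, and the threshold and helper names are mine): poll each running driver's monitoring REST API (`/api/v1/applications/<app-id>/jobs` on the driver UI port, 4040 by default), take the latest `completionTime` across jobs, and if the application has been idle longer than a threshold, kill it with `yarn application -kill <appId>`. A minimal sketch of the idle check:

```python
from datetime import datetime, timedelta, timezone

# Spark's REST API emits timestamps like '2024-05-01T12:00:00.000GMT'.
SPARK_TS = "%Y-%m-%dT%H:%M:%S.%f"

def parse_spark_ts(ts):
    """Parse a timestamp string from Spark's /api/v1 endpoints."""
    return datetime.strptime(ts.replace("GMT", ""), SPARK_TS).replace(
        tzinfo=timezone.utc
    )

def idle_for(jobs, now):
    """How long an application has been idle, given its job list from
    the driver's /api/v1/applications/<app-id>/jobs endpoint.

    A job with no completionTime is still running, so the app is busy.
    An app that never ran a job is treated as idle "forever".
    """
    if any("completionTime" not in j for j in jobs):
        return timedelta(0)
    if not jobs:
        return timedelta.max
    last = max(parse_spark_ts(j["completionTime"]) for j in jobs)
    return now - last
```

A cron job could then fetch the job list for each YARN application (the RM REST API at `/ws/v1/cluster/apps?states=RUNNING` lists them, including the tracking URL) and run `yarn application -kill` when `idle_for(...)` exceeds, say, a few hours. For the Jupyter case specifically, the notebook-side idle-culling settings might be a simpler fix, but the REST-polling approach covers both kinds of clients.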

Thanks
-- 
nicolas

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
