Gyorgy Gal created SPARK-36693:
----------------------------------

             Summary: Implement spark-shell idle timeouts
                 Key: SPARK-36693
                 URL: https://issues.apache.org/jira/browse/SPARK-36693
             Project: Spark
          Issue Type: New Feature
          Components: Spark Shell
    Affects Versions: 3.1.2
            Reporter: Gyorgy Gal
Many customers have been asking whether there is a setting they can use to kill idle spark-shell sessions, since they cannot realistically go to every developer's desk and make them press Ctrl+D or call exit() when their work is done. Our response so far has been to enable dynamic allocation, which releases the executors after the configured idle timeout. However, this is not an ideal solution: the shell process is still there, and even though its AM occupies only a small amount of resources, the user still has to kill the idle spark-shell via CM > Applications > spark-shell > Kill, or run 'kill -9' on the OS to remove it.

It would be nice to have a property in Spark (and exposed in CM) that deals with idle spark-shells, similar to the idle session timeout we already have for Beeline, and leave it to the admins to decide whether the idle spark-shell timeout should be one day, a week, etc.
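For illustration, the dynamic-allocation workaround above relies on the existing spark.dynamicAllocation.enabled and spark.dynamicAllocation.executorIdleTimeout settings, which only reclaim executors, not the driver or its AM. Below is a rough sketch of how an idle timeout could be enforced inside the shell itself; the property name spark.shell.idleTimeout and the ShellIdleWatchdog object are hypothetical illustrations, not existing Spark configuration or code.

{code:scala}
import java.util.concurrent.atomic.AtomicLong
import java.util.concurrent.{Executors, ThreadFactory, TimeUnit}

// Hypothetical sketch: a watchdog the REPL could start when a property
// such as "spark.shell.idleTimeout" (not an existing Spark key) is set.
object ShellIdleWatchdog {

  // Timestamp (ms) of the last command the user ran in the shell.
  private val lastActivity = new AtomicLong(System.currentTimeMillis())

  // The REPL loop would call this every time a line is interpreted.
  def touch(): Unit = lastActivity.set(System.currentTimeMillis())

  // Start a daemon thread that stops the SparkContext and exits the JVM
  // once the shell has been idle for longer than `timeoutMs`.
  def start(timeoutMs: Long, stopContext: () => Unit): Unit = {
    val factory = new ThreadFactory {
      override def newThread(r: Runnable): Thread = {
        val t = new Thread(r, "spark-shell-idle-watchdog")
        t.setDaemon(true)
        t
      }
    }
    val scheduler = Executors.newSingleThreadScheduledExecutor(factory)
    scheduler.scheduleAtFixedRate(new Runnable {
      override def run(): Unit = {
        val idleMs = System.currentTimeMillis() - lastActivity.get()
        if (idleMs > timeoutMs) {
          System.err.println(s"spark-shell idle for ${idleMs / 1000}s, shutting down")
          stopContext() // e.g. sc.stop(), which also releases the AM and executors
          System.exit(0)
        }
      }
    }, 1, 1, TimeUnit.MINUTES)
  }
}
{code}

If something along these lines were wired into spark-shell, the only admin-facing knob would be the timeout value itself (e.g. one day or one week), which is exactly what this issue asks to expose.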