Hi all,

We have a long-running PySpark session in client mode that occasionally dies.
We'd like to check whether the session is still alive. One solution we came up with was checking whether the UI is still up, but we were wondering if there's an easier way than that. Maybe something like spark.getActiveSession() would do the trick; I noticed that it throws a connection refused error once the current Spark session has died.

Are there any official/suggested ways to check this? I couldn't find much in the docs or previous mailing list threads.

Kind regards,
Yeachan
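P.S. In case it helps frame the question, this is roughly the kind of ad-hoc check we've been considering as a fallback. spark_session_is_alive is just our own hypothetical helper, not a Spark API; it forces a trivial round trip to the driver and treats any failure (e.g. the connection refused error mentioned above) as "not alive":

    from pyspark.sql import SparkSession

    def spark_session_is_alive(spark: SparkSession) -> bool:
        # Best-effort liveness probe: run a trivial query so the Python
        # side has to talk to the JVM backend. Any exception (e.g.
        # connection refused after the driver has gone away) is treated
        # as the session being dead.
        try:
            spark.sql("SELECT 1").collect()
            return True
        except Exception:
            return False

It works, but it feels like a workaround, which is why we're asking whether there's a sanctioned way to do this.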