We are considering deploying a notebook server for use by two kinds of users:

1. Interactive dashboards:
   1. Forms that let users select data sets and visualizations
   2. Real-time graphs of data captured by our Spark streams
2. General notebooks for data scientists

My concern is that interactive Spark jobs can consume a lot of cluster resources, and many users may be sloppy or lazy, i.e. just kill their browsers instead of shutting down their notebooks cleanly.

What are best practices?

Kind regards,
Andy