Dmitry Goldenberg Fri, 11 Sep 2015 08:11:47 -0700
Is there a way to kill a laggard Spark job manually, and more importantly, is there a way to do it programmatically based on a configurable timeout value?
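To make the second part concrete, below is a rough sketch of the kind of thing I have in mind, assuming SparkContext's job-group API (setJobGroup / cancelJobGroup) is the right hook for this, and using a made-up job.timeout.seconds property as the configurable timeout:

import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

import org.apache.spark.{SparkConf, SparkContext}

object TimedJobRunner {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("timed-job"))

    // Hypothetical configurable timeout, read from a system property here.
    val timeout = sys.props.getOrElse("job.timeout.seconds", "300").toLong.seconds
    val groupId = "laggard-candidate"

    // Run the action on a separate thread, tagged with a job group so that
    // every Spark job it submits can be cancelled as a unit.
    val work = Future {
      sc.setJobGroup(groupId, "action guarded by a timeout", interruptOnCancel = true)
      sc.textFile("hdfs:///some/input").count() // placeholder action
    }

    try {
      val result = Await.result(work, timeout)
      println(s"Finished, count = $result")
    } catch {
      case _: java.util.concurrent.TimeoutException =>
        // Timed out: kill the laggard job(s) belonging to the group.
        sc.cancelJobGroup(groupId)
        println(s"Cancelled jobs in group '$groupId' after $timeout")
    } finally {
      sc.stop()
    }
  }
}

Is something along these lines the recommended approach, or is there a better or built-in mechanism for enforcing a timeout on a job?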
Thanks.