Hello everyone!
As the title says:
I started the Spark SQL 1.2.0 Thrift server and use beeline to connect to it
and execute SQL.
I want to kill a single SQL job running in the Thrift server without killing
the Thrift server itself.
I set the property spark.ui.killEnabled=true in spark-defaults.conf.
But in the UI, only stages have a kill link, not whole jobs.
I would expect that killing a stage would kill the whole job. Are you not
seeing that happen?
On Mon, Dec 22, 2014 at 5:09 AM, Xiaoyu Wang wangxy...@gmail.com wrote: