Using Spark 1.6.1
Spark Streaming jobs are submitted via spark-submit (cluster mode).
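
For context, the jobs are submitted roughly like this (the master URL, class, and jar below are placeholders). In standalone cluster mode, the submission output includes a driver ID (driver-<timestamp>-<seq>), which is what the kill commands below expect:

    ./bin/spark-submit \
      --master spark://master-host:7077 \
      --deploy-mode cluster \
      --class com.example.MyStreamingApp \
      hdfs:///jars/my-streaming-app.jar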

I tried to kill the drivers via the web UI, but it does not work: the
drivers are still running.
I also tried:
1. spark-submit --master <master-url> --kill <driver-id>
2. ./bin/spark-class org.apache.spark.deploy.Client kill <master-url> <driver-id>
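
Concretely, with a placeholder driver ID filled in, those two attempts look like this:

    # 1. spark-submit's built-in kill (standalone cluster mode)
    ./bin/spark-submit --master spark://master-host:7077 \
      --kill driver-20160523112334-0012

    # 2. the lower-level standalone Client
    ./bin/spark-class org.apache.spark.deploy.Client kill \
      spark://master-host:7077 driver-20160523112334-0012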

Neither works. The workaround is to ssh to the driver node, then kill -9
the process. jps shows the same class name (DriverWrapper) for every
driver, so I have to pick the right PID carefully...
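
For completeness, the manual workaround looks roughly like this (the worker host is a placeholder; since jps only reports DriverWrapper, I match on my app's main class in the full command line, which DriverWrapper receives as an argument):

    ssh worker-host
    jps | grep DriverWrapper          # every driver shows up as DriverWrapper
    ps -ef | grep MyStreamingApp      # find the one running my app's main class
    kill -9 <pid>                     # note: SIGKILL bypasses JVM shutdown hooks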

Any idea why this happens?
BTW, my streaming job's batch duration is one hour. Do we need to wait
for the current batch to finish processing before the driver can be killed?
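
If it matters: I believe Spark Streaming (since 1.4) has a conf flag that stops the StreamingContext gracefully on JVM shutdown, so a plain SIGTERM would let in-flight batch work finish before the driver exits; I have not verified whether this interacts with the kill problem above:

    ./bin/spark-submit \
      --master spark://master-host:7077 \
      --deploy-mode cluster \
      --conf spark.streaming.stopGracefullyOnShutdown=true \
      --class com.example.MyStreamingApp \
      hdfs:///jars/my-streaming-app.jar

Note that kill -9 skips shutdown hooks entirely, so graceful shutdown would only apply to SIGTERM.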

-- 
Hao Ren

Data Engineer @ leboncoin

Paris, France
