Parth Gandhi created SPARK-21503:
------------------------------------

             Summary: Spark UI shows incorrect task status for a killed Executor Process
                 Key: SPARK-21503
                 URL: https://issues.apache.org/jira/browse/SPARK-21503
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.2.0
            Reporter: Parth Gandhi
            Priority: Minor


The Executors tab of the Spark UI shows a task as completed when the executor process running that task is killed with the kill command, even though the task did not actually finish.

Steps:
1. Run a long-running Spark job, so that executors can be killed while tasks are still active. As an example, I ran a pyspark job with the following command (a minimal sketch of pi.py follows the steps):

$SPARK_HOME/bin/spark-submit --master yarn --deploy-mode cluster \
  --queue default --num-executors 10 --driver-memory 2G \
  --conf spark.pyspark.driver.python=./Python3/bin/python \
  --conf spark.pyspark.python=./Python3/bin/python \
  --archives hdfs:///user/USERNAME/Python3.zip#Python3 \
  ~/pi.py

2. Go to the Executors tab of the UI to see which executors are running and the host/port of each.

3. SSH to each of the executor hosts and kill the executor's Java process, i.e. the JVM bound to the port shown in the UI (it can typically be found with jps, where it appears as CoarseGrainedExecutorBackend), using the following command:

kill <pid> OR kill -9 <pid>
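
For reference, the exact pi.py is not attached to this issue; any sufficiently long-running job reproduces the behavior. A minimal sketch along the lines of Spark's bundled Monte Carlo pi example (the partition count here is illustrative, not the actual script) would be:

# pi.py (sketch): estimate pi by random sampling. The many small tasks
# keep executors busy long enough to kill them mid-run.
from random import random
from operator import add

from pyspark.sql import SparkSession

if __name__ == "__main__":
    spark = SparkSession.builder.appName("PythonPi").getOrCreate()

    partitions = 100           # illustrative; more partitions = longer run
    n = 100000 * partitions

    def inside(_):
        # Sample a point in the square [-1, 1] x [-1, 1] and count it
        # if it falls inside the unit circle.
        x = random() * 2 - 1
        y = random() * 2 - 1
        return 1 if x ** 2 + y ** 2 <= 1 else 0

    count = spark.sparkContext.parallelize(range(1, n + 1), partitions) \
        .map(inside) \
        .reduce(add)
    print("Pi is roughly %f" % (4.0 * count / n))

    spark.stop()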


