Shay Rojansky created SPARK-2972:
------------------------------------

             Summary: APPLICATION_COMPLETE not created in Python unless context explicitly stopped
                 Key: SPARK-2972
                 URL: https://issues.apache.org/jira/browse/SPARK-2972
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 1.0.2
         Environment: Cloudera 5.1, YARN master on Ubuntu Precise
            Reporter: Shay Rojansky


If you don't explicitly stop the SparkContext at the end of a Python application 
with sc.stop(), the APPLICATION_COMPLETE file isn't created and the application 
doesn't get picked up by the history server.

This is easy to reproduce with the interactive pyspark shell, but it affects standalone scripts as well.

The current workaround is to wrap the entire script in a try/finally block and call 
sc.stop() manually, as sketched below.
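
A minimal sketch of that workaround (the app name and the job body are illustrative placeholders, not taken from any real script):

{code}
from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("example-app")  # hypothetical app name
sc = SparkContext(conf=conf)

try:
    # placeholder job body; the real driver logic would go here
    counts = sc.parallelize(range(100)).map(lambda x: x % 10).countByValue()
    print(counts)
finally:
    # without this explicit stop, APPLICATION_COMPLETE is never written
    # and the history server never lists the application
    sc.stop()
{code}

The try/finally guarantees sc.stop() runs even when the job raises, which is what closes out the event log and writes the completion marker.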



--
This message was sent by Atlassian JIRA
(v6.2#6252)
