Ben Yu created SPARK-52414:
------------------------------

             Summary: Spark job fails after 
                 Key: SPARK-52414
                 URL: https://issues.apache.org/jira/browse/SPARK-52414
             Project: Spark
          Issue Type: Bug
          Components: Spark Submit
    Affects Versions: 4.0.0, 3.0.0
            Reporter: Ben Yu


The *spark-job-history* folder is created when the Hadoop cluster is created. Its 
path is stored in *spark.eventLog.dir* and *spark.history.fs.logDirectory* in 
{*}spark-defaults.conf{*}.

However, if the *spark-job-history* folder is accidentally deleted, Spark jobs 
will fail with the message below:

ERROR SparkContext: Error initializing SparkContext.
java.io.FileNotFoundException: File not found: <PATH>/spark-job-history

 

It would be better to check for the existence of the *spark-job-history* folder at 
the beginning of each job run and re-create it if it does not exist.
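As a workaround until such a check exists in Spark itself, a pre-submit guard can re-create the directory. Below is a minimal sketch that parses the configured *spark.eventLog.dir* and re-creates it if missing; it uses a local path and hypothetical demo filenames for illustration, whereas on HDFS you would use `hdfs dfs -test -d` and `hdfs dfs -mkdir -p` instead of the local tests:

```shell
# Demo stand-ins (hypothetical paths, not from the report)
CONF=/tmp/spark-defaults.conf.demo
LOG_DIR=/tmp/spark-job-history.demo

# Minimal stand-in for spark-defaults.conf
printf 'spark.eventLog.dir %s\n' "$LOG_DIR" > "$CONF"

# Read the configured directory and re-create it if it was deleted.
# On a real cluster: hdfs dfs -test -d "$DIR" || hdfs dfs -mkdir -p "$DIR"
DIR=$(awk '$1 == "spark.eventLog.dir" {print $2}' "$CONF")
[ -d "$DIR" ] || mkdir -p "$DIR"
echo "event log dir present: $DIR"
```

Running this guard before `spark-submit` avoids the `FileNotFoundException` during SparkContext initialization.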



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
