zuotingbing created SPARK-22174:
-----------------------------------

             Summary: Support to automatically create the directory where the event logs go (`spark.eventLog.dir`)
                 Key: SPARK-22174
                 URL: https://issues.apache.org/jira/browse/SPARK-22174
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
    Affects Versions: 2.2.0
            Reporter: zuotingbing
            Priority: Minor
{code:java}
2017-09-30 09:47:44,721 ERROR org.apache.spark.SparkContext: Error initializing SparkContext.
java.io.FileNotFoundException: File file:/tmp/spark-events does not exist
	at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:611)
	at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:824)
	at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:601)
	at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:421)
	at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:93)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:516)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2258)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:846)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:838)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:838)
	at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
	at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
{code}

Currently, if an application has event logging enabled, the directory the event logs are written to (`spark.eventLog.dir`) must be created manually; otherwise SparkContext initialization fails as shown above. I suggest creating the event log directory automatically in the source code, which would make Spark more convenient to use.

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
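Until such a change lands, the failure can be avoided by pre-creating the directory before submitting the application. A minimal sketch, assuming the default local location `file:/tmp/spark-events` (substitute whatever `spark.eventLog.dir` points to; an HDFS location would need `hdfs dfs -mkdir -p` instead):

```shell
# Pre-create the event log directory so EventLoggingListener.start
# does not fail with FileNotFoundException.
EVENT_LOG_DIR=/tmp/spark-events
mkdir -p "$EVENT_LOG_DIR"

# Then submit with event logging enabled, pointing at the same directory,
# for example (invocation shown for illustration):
#   spark-submit \
#     --conf spark.eventLog.enabled=true \
#     --conf spark.eventLog.dir=file:${EVENT_LOG_DIR} \
#     --class org.apache.spark.examples.SparkPi \
#     <path-to-spark-examples-jar>
```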