"spark.history.fs.logDirectory" is for the history server. For Spark
applications, they should use "spark.eventLog.dir". Since you commented out
"spark.eventLog.dir", it will be "/tmp/spark-events". And this folder does
not exits.
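
For example, a minimal spark-defaults.conf along these lines should make the
applications log to the directory you already created (assuming that directory
is readable and writable by the user running the driver):

    spark.eventLog.enabled           true
    spark.eventLog.dir               file:/opt/cb/tmp/spark-events
    spark.history.fs.logDirectory    file:/opt/cb/tmp/spark-events

Keeping "spark.history.fs.logDirectory" pointed at the same location lets the
history server read the logs the applications write there.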

Best Regards,
Shixiong Zhu

2015-04-29 23:22 GMT-07:00 James King <jakwebin...@gmail.com>:

> I'm unclear why I'm getting this exception.
>
> It seems to have recognized that I want to enable Event Logging, but it is
> ignoring where I want it to log to, i.e. file:/opt/cb/tmp/spark-events,
> which does exist.
>
> spark-defaults.conf
>
> # Example:
> spark.master                     spark://master1:7077,master2:7077
> spark.eventLog.enabled           true
> spark.history.fs.logDirectory    file:/opt/cb/tmp/spark-events
> # spark.eventLog.dir               hdfs://namenode:8021/directory
> # spark.serializer               org.apache.spark.serializer.KryoSerializer
> # spark.driver.memory            5g
> # spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
>
> Exception following job submission:
>
> spark.eventLog.enabled=true
> spark.history.fs.logDirectory=file:/opt/cb/tmp/spark-events
> spark.jars=file:/opt/cb/scripts/spark-streamer/cb-spark-streamer-1.0-SNAPSHOT.jar
> spark.master=spark://master1:7077,master2:7077
> Exception in thread "main" java.lang.IllegalArgumentException: Log directory /tmp/spark-events does not exist.
>         at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:99)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:399)
>         at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:642)
>         at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
>         at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:132)
>
>
> Many Thanks
> jk
>