Hi all,

As Simon explained, you need to set "spark.eventLog.enabled" to true.

I'd like to add that using SPARK_JAVA_OPTS to set Spark
configurations is deprecated. I'm sure many of you have noticed this from
the scary warning message we print out. :) The recommended and supported
way of setting this is to add the line "spark.eventLog.enabled true" to
$SPARK_HOME/conf/spark-defaults.conf. This will be picked up by
spark-submit and passed to your application.
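For example, a minimal spark-defaults.conf enabling event logging might look like the following (the HDFS path is just Simon's example below; substitute your own log location):

```
# $SPARK_HOME/conf/spark-defaults.conf
spark.eventLog.enabled   true
spark.eventLog.dir       hdfs://idp11:9100/user/myname/logs/
```

Any property set this way applies to every application launched through spark-submit, so you don't need to repeat it per job.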

Cheers,
Andrew


2014-08-14 15:45 GMT-07:00 durin <m...@simon-schaefer.net>:

> If I understand you correctly, setting event logging in
> SPARK_JAVA_OPTS
> should achieve what you want. I'm logging to HDFS, but according to the
> config page <http://spark.apache.org/docs/latest/configuration.html> a
> local folder should be possible as well.
>
> Example with all other settings removed:
>
> SPARK_JAVA_OPTS="-Dspark.eventLog.enabled=true
> -Dspark.eventLog.dir=hdfs://idp11:9100/user/myname/logs/"
>
> This works with the Spark shell; I haven't tested other applications,
> though.
>
>
> Note that the completed applications will disappear from the list if you
> restart Spark completely, even though they'll still be stored in the log
> folder.
>
>
> Best regards,
> Simon
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-webUI-application-details-page-tp3490p12150.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>
