Perhaps you need to set this in your spark-defaults.conf so that it's
already set when your slave/worker processes start.
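
For example, something like this in conf/spark-defaults.conf on each node
(the directory below is just an example; use whatever path you actually log
to, and make sure the master can read it, since it rebuilds the application
UI from the event logs):

  spark.eventLog.enabled   true
  spark.eventLog.dir       file:/tmp/spark-events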

-Joe

On 1/25/15, 6:50 PM, "ilaxes" <ila...@hotmail.com> wrote:

>Hi,
>
>I have a similar problem. I want to see the detailed logs of Completed
>Applications, so I've set this in my program:
>  set("spark.eventLog.enabled", "true")
>    .set("spark.eventLog.dir", "file:/tmp/spark-events")
>
>but when I click on the application in the webui, I got a page with the
>message :
>Application history not found (app-20150126000651-0331)
>No event logs found for application xxx$ in
>file:/tmp/spark-events/xxx-1422227211500. Did you specify the correct
>logging directory?
>
>despite the fact that the directory exists and contains 3 files:
>APPLICATION_COMPLETE*
>EVENT_LOG_1*
>SPARK_VERSION_1.1.0*
>
>I use Spark 1.1.0 on a standalone cluster with 3 nodes.
>
>Any suggestions to solve the problem?
>
>
>Thanks.
>
>
>
>
>--
>View this message in context:
>http://apache-spark-user-list.1001560.n3.nabble.com/Spark-webUI-application-details-page-tp3490p21358.html
>Sent from the Apache Spark User List mailing list archive at Nabble.com.
>


---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
