Event logs are different from the output you write with a logger like log4j.
The event logs are the data that show up in the history server.
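
Just to illustrate, here is a minimal sketch of turning event logging on from a
SparkConf (the app name is made up; the directory is the default mentioned in
the quoted reply below):

    import org.apache.spark.{SparkConf, SparkContext}

    // Event logging is what the history server reads; these keys control it.
    val conf = new SparkConf()
      .setAppName("EventLogExample")                    // example name only
      .set("spark.eventLog.enabled", "true")            // write event logs
      .set("spark.eventLog.dir", "/tmp/spark-events")   // default location
    val sc = new SparkContext(conf)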

On my team we use com.typesafe.scalalogging.slf4j.Logging. Our logs show up
in /etc/spark/work/<app-id>/<executor-id>/stderr and stdout.

All of our logging seems to show up in stderr.
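
Roughly what that looks like in code (a sketch, not our actual job; the object
name and input path are made up):

    import com.typesafe.scalalogging.slf4j.Logging
    import org.apache.spark.{SparkConf, SparkContext}

    // Mixing in the Logging trait provides an slf4j-backed `logger` member.
    // Lines logged here on the driver go to the driver's log; tasks running on
    // executors write to the per-executor stderr files mentioned above.
    object ExampleJob extends Logging {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("ExampleJob"))
        logger.info("starting job")
        val count = sc.textFile("/data/input.txt").count()  // hypothetical path
        logger.info(s"line count: $count")
        sc.stop()
      }
    }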

-Suren




On Tue, Jun 10, 2014 at 2:56 PM, coderxiang <shuoxiang...@gmail.com> wrote:

> By default, the logs are available at `/tmp/spark-events`. You can specify
> the log directory via spark.eventLog.dir, see  this configuration page
> <http://spark.apache.org/docs/latest/configuration.html>  .
>
>
>



-- 

SUREN HIRAMAN, VP TECHNOLOGY
Velos
Accelerating Machine Learning

440 NINTH AVENUE, 11TH FLOOR
NEW YORK, NY 10001
O: (917) 525-2466 ext. 105
F: 646.349.4063
E: suren.hiraman@velos.io
W: www.velos.io
