I'm wondering how logging works in Spark.

I see there's a log4j.properties.template file in the conf directory.
Is it safe to assume Spark is using log4j 1?  What's the recommended
approach if we're using log4j 2?  I've got a log4j2.xml file in the job
jar, which seems to be working for my own log statements, but Spark's
logging seems to take its own default route despite my setting Spark's
loggers to 'warn' only.
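
For concreteness, my log4j2.xml looks roughly like this (trimmed down;
the appender details aren't important, and I'm assuming the standard
log4j 2 XML configuration format):

    <?xml version="1.0" encoding="UTF-8"?>
    <Configuration status="WARN">
      <Appenders>
        <Console name="console" target="SYSTEM_OUT">
          <PatternLayout pattern="%d{HH:mm:ss} %-5level %logger{36} - %msg%n"/>
        </Console>
      </Appenders>
      <Loggers>
        <!-- my intent: quiet Spark's own chatter down to warn -->
        <Logger name="org.apache.spark" level="warn"/>
        <Root level="info">
          <AppenderRef ref="console"/>
        </Root>
      </Loggers>
    </Configuration>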

More interestingly, what happens when file-based appenders are in play?

If a log statement is in the driver program, I assume it'll get written
to a log file co-located with the driver. What about log statements in
the partition-processing functions?  Will their output go to a file on
the worker machine where the partition is processed, or will Spark
capture this log output and divert it into the log file on the driver's
machine?
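
To make that question concrete, here's a minimal sketch of what I mean
(object and logger names are made up; I obtain the logger inside the
closure since loggers generally aren't serializable):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.slf4j.LoggerFactory

    object LoggingWhere {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("logging-where"))
        val log = LoggerFactory.getLogger(getClass)

        // this statement runs in the driver JVM
        log.info("driver-side log statement")

        sc.parallelize(1 to 100, 4).foreachPartition { it =>
          // this logger is obtained inside the closure, so it lives in an
          // executor JVM -- where would a file appender write from here?
          val partLog = LoggerFactory.getLogger("my.job.partitions")
          partLog.info(s"processed ${it.size} records in this partition")
        }

        sc.stop()
      }
    }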

Thanks.



