I've been trying to get log4j2 and logback to play nicely with Spark
1.6.0 so I can offload my logs to a remote server.

I've attempted the following things:

1. Setting logback/log4j2 on the class path for both the driver and worker
nodes
2. Passing the -Dlog4j.configurationFile= and -Dlogback.configurationFile=
flags via spark.driver.extraJavaOptions and spark.executor.extraJavaOptions
(see the command sketched below)
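
Concretely, the submit command for the logback attempt looks roughly like
this (paths, jar, and class names are placeholders for our real deployment;
the log4j2 attempt swaps in -Dlog4j.configurationFile= instead):

  spark-submit \
    --conf "spark.driver.extraClassPath=/opt/app/classes" \
    --conf "spark.executor.extraClassPath=/opt/app/classes" \
    --conf "spark.driver.extraJavaOptions=-Dlogback.configurationFile=/opt/app/classes/logback.xml" \
    --conf "spark.executor.extraJavaOptions=-Dlogback.configurationFile=/opt/app/classes/logback.xml" \
    --class com.example.MyJob /opt/app/my-job.jar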

log4j2 worked on Spark 1.5.2, but since we upgraded, logging falls back
to the default log4j 1.2. And even when I do get Spark to load logback, it
doesn't find my logback.xml file located at
%APP_DIRECTORY%/classes/logback.xml.
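
For reference, the logback.xml itself is nothing exotic; trimmed down, it
is essentially this (the syslog host is a placeholder):

  <configuration>
    <appender name="SYSLOG" class="ch.qos.logback.classic.net.SyslogAppender">
      <syslogHost>logs.example.com</syslogHost>
      <facility>USER</facility>
      <suffixPattern>[%thread] %logger %msg</suffixPattern>
    </appender>
    <root level="INFO">
      <appender-ref ref="SYSLOG" />
    </root>
  </configuration>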

Instead, I always see Spark fall back to this:

"Reading configuration from URL
jar:file:/usr/lib/spark/lib/spark-assembly-1.6.0-hadoop2.7.1.jar!/org/apache/spark/log4j-defaults.properties"
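
In case it's relevant: to get logback bound on the slf4j side at all, I'm
excluding the bundled log4j 1.2 bindings in my build, roughly like this in
sbt (versions approximate):

  libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" excludeAll(
    ExclusionRule("org.slf4j", "slf4j-log4j12"),
    ExclusionRule("log4j", "log4j")
  )
  libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.1.3"
  libraryDependencies += "org.slf4j" % "log4j-over-slf4j" % "1.7.16"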

Has anyone had similar issues with this?
