Re: log4j custom appender ClassNotFoundException with spark 1.4.1

2015-08-07 Thread mlemay
spark-defaults.conf (spark.driver.extraJavaOptions, spark.jars, spark.driver.extraClassPath, ...) On Fri, Aug 7, 2015 at 8:57 AM, mlemay [via Apache Spark User List] wrote: That starts to smell... When analyzing SparkSubmit.scala, we can see that one of the first
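
The settings named above live in spark-defaults.conf. A hedged sketch of what such entries might look like for shipping a custom appender to the driver (paths are placeholders, not taken from the thread):

```properties
# Illustrative spark-defaults.conf entries; adjust paths for your setup.
# Put the jar containing the custom appender on the driver classpath:
spark.driver.extraClassPath    /path/to/my-appender.jar
# Point log4j at a custom configuration file on the driver:
spark.driver.extraJavaOptions  -Dlog4j.configuration=file:/path/to/log4j.properties
# Also distribute the jar to executors:
spark.jars                     /path/to/my-appender.jar
```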

Re: log4j custom appender ClassNotFoundException with spark 1.4.1

2015-08-07 Thread mlemay
The offending commit is [SPARK-6014] [core] Revamp Spark shutdown hooks, fix shutdown races. https://github.com/apache/spark/commit/e72c16e30d85cdc394d318b5551698885cfda9b8

Re: log4j.xml bundled in jar vs log4.properties in spark/conf

2015-08-07 Thread mlemay
See this post for a detailed explanation of your problem: http://apache-spark-user-list.1001560.n3.nabble.com/log4j-custom-appender-ClassNotFoundException-with-spark-1-4-1-tt24159.html

Re: log4j custom appender ClassNotFoundException with spark 1.4.1

2015-08-07 Thread mlemay
Looking at the call stack and the diffs between 1.3.1 and 1.4.1-rc4, I see something that could be relevant to the issue. 1) The call stack shows that the log4j manager gets initialized and uses the default Java context class loader. This context class loader should probably be Spark's MutableURLClassLoader, but
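
A minimal sketch (not Spark code; the appender class name is hypothetical) of why the context class loader matters here: log4j resolves appender class names through the thread context class loader, so a class that lives only in the user jar is invisible if log4j initializes before MutableURLClassLoader is installed as the context loader.

```scala
// Sketch: log4j effectively resolves appender classes via the thread
// context class loader. A class missing from that loader's classpath
// fails with ClassNotFoundException, as described in the thread.
object ContextLoaderDemo {
  def main(args: Array[String]): Unit = {
    val ctx = Thread.currentThread().getContextClassLoader
    // A JDK class is always resolvable:
    val ok = Class.forName("java.lang.String", true, ctx)
    println(s"resolved: ${ok.getName}")
    try {
      // Hypothetical user-jar class, not on this loader's classpath:
      Class.forName("com.example.MyCustomAppender", true, ctx)
    } catch {
      case _: ClassNotFoundException => println("custom appender not visible")
    }
  }
}
```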

Re: log4j custom appender ClassNotFoundException with spark 1.4.1

2015-08-07 Thread mlemay
That starts to smell... When analyzing SparkSubmit.scala, we can see that one of the first things it does is parse arguments. This uses the Utils object and triggers initialization of its member variables. One such variable is ShutdownHookManager (which didn't exist in Spark 1.3), with the later
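
The initialization-order hazard described above can be sketched as follows (the names are illustrative, not Spark's actual code): referencing any member of a Scala object runs the object's entire constructor body, so an innocuous field read can trigger heavy side effects such as initializing the logging system too early.

```scala
// Sketch of eager object initialization in Scala. Reading one field
// of EagerSideEffect runs its whole constructor body as a side effect
// (in Spark's case, that side effect would be initializing log4j).
object EagerSideEffect {
  println("constructor body runs: logging would initialize here")
  val installed: Boolean = true
}

object InitOrderDemo {
  def main(args: Array[String]): Unit = {
    // Merely reading one field forces the whole object initializer.
    println(s"installed = ${EagerSideEffect.installed}")
  }
}
```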

Re: log4j.xml bundled in jar vs log4.properties in spark/conf

2015-08-06 Thread mlemay
I'm having the same problem here.