-defaults.conf (spark.driver.extraJavaOptions, spark.jars, spark.driver.extraClassPath, ...)
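A minimal sketch of how such settings might look in spark-defaults.conf; the jar and file paths below are hypothetical examples, not taken from the thread:

```properties
# Make a custom log4j appender's jar visible to the driver JVM itself,
# not only to Spark's application class loader (all paths are examples)
spark.driver.extraClassPath    /opt/myapp/my-log4j-appender.jar
spark.jars                     /opt/myapp/my-log4j-appender.jar
spark.driver.extraJavaOptions  -Dlog4j.configuration=file:/opt/myapp/log4j.properties
```

Putting the jar on spark.driver.extraClassPath (rather than only in spark.jars) matters here because the driver's system class loader is what log4j sees when it initializes early.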
On Fri, Aug 7, 2015 at 8:57 AM, mlemay [via Apache Spark User List] <ml-node+s1001560n24169...@n3.nabble.com> wrote:
The offending commit is:
[SPARK-6014] [core] Revamp Spark shutdown hooks, fix shutdown races.
https://github.com/apache/spark/commit/e72c16e30d85cdc394d318b5551698885cfda9b8

See the post for a detailed explanation of your problem:
http://apache-spark-user-list.1001560.n3.nabble.com/log4j-custom-appender-ClassNotFoundException-with-spark-1-4-1-tt24159.html
Looking at the callstack and the diffs between 1.3.1 and 1.4.1-rc4, I see something that could be relevant to the issue.

1) The callstack shows that the log4j manager gets initialized using the default Java context class loader. This context class loader should probably be Spark's MutableURLClassLoader, but...

That starts to smell...

When analyzing SparkSubmit.scala, we can see that one of the first things it does is parse its arguments. This uses the Utils object and triggers the initialization of its member variables. One such variable is ShutdownHookManager (which didn't exist in Spark 1.3), with the latter
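The initialization-order issue described above can be sketched in plain Java (the class below is illustrative, not Spark's actual code): log4j resolves appender classes through the thread context class loader, so any logging triggered before Spark installs its MutableURLClassLoader never sees the user's jars.

```java
import java.net.URL;
import java.net.URLClassLoader;

// Hypothetical sketch of the ordering problem: the context class loader
// only "knows" the user's jars after something Spark-like swaps it in.
// Anything (like log4j) initialized before that point resolves classes
// against the original loader and gets ClassNotFoundException for a
// custom appender shipped via --jars.
public class ContextLoaderDemo {
    public static void main(String[] args) {
        ClassLoader original = Thread.currentThread().getContextClassLoader();
        System.out.println("before swap: " + original.getClass().getName());

        // Stand-in for Spark's MutableURLClassLoader: a URLClassLoader that
        // would contain the user's jar URLs (empty here for the sketch).
        URLClassLoader sparkLike = new URLClassLoader(new URL[0], original);
        Thread.currentThread().setContextClassLoader(sparkLike);
        System.out.println("after swap:  "
            + Thread.currentThread().getContextClassLoader().getClass().getName());
    }
}
```

Only code that runs after the swap (and on the same thread) sees the jar-aware loader, which is why a static initializer fired during argument parsing is too early.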
I'm having the same problem here.
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/log4j-xml-bundled-in-jar-vs-log4-properties-in-spark-conf-tp23923p24158.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.