Re: log4j custom appender ClassNotFoundException with spark 1.4.1
One possible solution is to spark-submit with --driver-class-path and list all transitive dependencies by hand, but this is fragile and error-prone. Non-working alternatives (all of these are processed in SparkSubmit.scala only AFTER the argument parser is initialized):
- spark-submit --packages ...
- spark-submit --jars ...
- spark-defaults.conf (spark.driver.extraJavaOptions, spark.jars, spark.driver.extraClassPath, ...)

On Fri, Aug 7, 2015 at 8:57 AM, mlemay [via Apache Spark User List] ml-node+s1001560n24169...@n3.nabble.com wrote:
> spark.util.Utils has a new static dependency on log4j that triggers its initialization before the call to setContextClassLoader(MutableURLClassLoader). Does anyone have a workaround to make this work in 1.4.1?
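A minimal sketch of that fragile --driver-class-path workaround, assuming a hypothetical layout in which the custom appender is packaged as my-appender.jar and its transitive dependency jars have been copied into ./applibs (none of these names come from the thread):

```shell
# Join every dependency jar into a colon-separated classpath, then pass it
# via --driver-class-path so the appender's classes are visible to the
# driver's system class loader before log4j initializes.
DEPS=$(find ./applibs -name '*.jar' | paste -sd: -)
echo spark-submit \
  --driver-class-path "my-appender.jar:${DEPS}" \
  --class com.example.Main my-app.jar
```

The `echo` prints the assembled command for inspection; drop it to actually submit. The fragility is exactly what the message describes: every new transitive dependency must be re-listed by hand.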
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/log4j-custom-appender-ClassNotFoundException-with-spark-1-4-1-tp24159p24170.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Re: log4j custom appender ClassNotFoundException with spark 1.4.1
The offending commit is: [SPARK-6014] [core] Revamp Spark shutdown hooks, fix shutdown races. https://github.com/apache/spark/commit/e72c16e30d85cdc394d318b5551698885cfda9b8

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
Re: log4j.xml bundled in jar vs log4.properties in spark/conf
See this post for a detailed explanation of your problem: http://apache-spark-user-list.1001560.n3.nabble.com/log4j-custom-appender-ClassNotFoundException-with-spark-1-4-1-tt24159.html
Re: log4j custom appender ClassNotFoundException with spark 1.4.1
Looking at the call stack and the diffs between 1.3.1 and 1.4.1-rc4, I see something that could be relevant to the issue.

1) The call stack shows that the log4j manager gets initialized using the default Java context class loader. This context class loader should probably be Spark's MutableURLClassLoader, but it's not. We can assume that currentThread.setContextClassLoader has not been called yet.

2) Still in the call stack, we can see that ShutdownHookManager is the class object responsible for triggering log4j initialization.

3) Looking at the diffs between 1.3 and 1.4, we can see that this ShutdownHookManager is a new class object.

With this information, is it possible that ShutdownHookManager makes log4j initialize too early, that is, before Spark gets the chance to set its MutableURLClassLoader on the thread context? Let me know if this does not make sense.

Mike
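The class-loader mismatch in point 1 can be sketched with plain JDK calls. This is a minimal illustration, not Spark's actual code: com.example.MyCustomAppender is a hypothetical appender class name, and the empty URLClassLoader stands in for Spark's MutableURLClassLoader (the real one would carry the user jar's URLs):

```java
import java.net.URL;
import java.net.URLClassLoader;

public class ContextLoaderDemo {
    public static void main(String[] args) {
        ClassLoader ctx = Thread.currentThread().getContextClassLoader();
        // log4j resolves appender classes roughly like this, via the
        // thread's context class loader at initialization time:
        try {
            Class.forName("com.example.MyCustomAppender", false, ctx);
        } catch (ClassNotFoundException e) {
            // Fails: the default context loader cannot see the user jar.
            System.out.println("ClassNotFoundException before user jars are visible");
        }
        // Spark later installs a child loader that DOES contain the user jar;
        // any lookup made before this point has already failed.
        URLClassLoader withUserJars = new URLClassLoader(new URL[0], ctx);
        Thread.currentThread().setContextClassLoader(withUserJars);
        System.out.println(Thread.currentThread().getContextClassLoader() == withUserJars);
    }
}
```

If log4j initialization were deferred until after the setContextClassLoader call, the same Class.forName lookup would go through the loader that knows the user jar.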
Re: log4j custom appender ClassNotFoundException with spark 1.4.1
That starts to smell... Analyzing SparkSubmit.scala, we can see that one of the first things it does is parse arguments. This uses the Utils object and triggers initialization of its member variables. One such variable is ShutdownHookManager (which didn't exist in Spark 1.3), and that is where the log4j initialization comes from. setContextClassLoader is called only a few steps after argument parsing, in submit > doRunMain > runMain.

That pretty much sums it up: spark.util.Utils has a new static dependency on log4j that triggers its initialization before the call to setContextClassLoader(MutableURLClassLoader).

Does anyone have a workaround to make this work in 1.4.1?
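The static-initialization ordering described here can be illustrated with a JDK-only sketch. The Utils class below is a hypothetical stand-in, not Spark's real spark.util.Utils: its static initializer captures whatever context class loader is current the first time the class is touched, just as a static log4j dependency would pin log4j to the pre-swap loader:

```java
// Stand-in for an object with a static dependency: the static field
// initializer runs the FIRST time the class is touched, capturing the
// context class loader current at that moment.
class Utils {
    static final ClassLoader AT_INIT = Thread.currentThread().getContextClassLoader();
    static String parseArgs(String[] args) { return String.join(" ", args); }
}

public class InitOrderDemo {
    public static void main(String[] args) throws Exception {
        ClassLoader original = Thread.currentThread().getContextClassLoader();
        // Argument parsing touches Utils, so its static init runs NOW,
        // before the class-loader swap below.
        Utils.parseArgs(new String[] {"--class", "Main"});
        // The swap happens only a few steps later -- too late:
        Thread.currentThread().setContextClassLoader(
            new java.net.URLClassLoader(new java.net.URL[0], original));
        // The static init already captured the original (default) loader.
        System.out.println(Utils.AT_INIT == original); // true
    }
}
```

Reordering so the setContextClassLoader call precedes the first touch of Utils would make the static initializer see the user-jar-aware loader instead.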
Re: log4j.xml bundled in jar vs log4.properties in spark/conf
I'm having the same problem here.