Can you submit a pull request for it? Thanks.

On Tue, Jun 2, 2015 at 4:25 AM, Mick Davies <michael.belldav...@gmail.com>
wrote:

> If I write unit tests that indirectly initialize
> org.apache.spark.util.Utils,
> for example by using SQL types, but produce no logging, I get the following
> unpleasant stack trace in my test output.
>
> This is caused by the Utils class adding a shutdown hook which logs the
> message logDebug("Shutdown hook called"). We are using log4j 2 for logging,
> and if there has been no logging before this point, the static
> initialization of log4j 2 tries to add a shutdown hook of its own but
> can't, because the JVM is already shutting down.
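>
> Roughly, the pattern is the following (a simplified sketch, not the actual
> Utils code; the object name is made up):
>
>     import org.slf4j.LoggerFactory
>
>     object ShutdownHookRepro {
>       def main(args: Array[String]): Unit = {
>         Runtime.getRuntime.addShutdownHook(new Thread {
>           override def run(): Unit = {
>             // If nothing has logged before this point, this is the first
>             // use of log4j 2; its static initialization then tries to
>             // register its own shutdown hook and fails with
>             // IllegalStateException: Shutdown in progress.
>             val log = LoggerFactory.getLogger("ShutdownHookRepro")
>             log.debug("Shutdown hook called")
>           }
>         })
>         // No other logging happens; the JVM then exits.
>       }
>     }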
>
> It's only slightly annoying, but it could easily be 'fixed' by adding a
> line like:
> logDebug("Adding shutdown hook")
> to Utils before the shutdown hook is added, ensuring logging is always
> initialized first. I am happy to make this change, unless there is a better
> approach or this is considered too trivial.
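>
> For illustration, the change would amount to something like this (again a
> sketch with made-up names, not an actual patch against Utils.scala):
>
>     import org.slf4j.LoggerFactory
>
>     object ShutdownHookFixed {
>       private val log = LoggerFactory.getLogger("ShutdownHookFixed")
>
>       def main(args: Array[String]): Unit = {
>         // Logging once here forces log4j 2 to initialize while the JVM is
>         // still running normally, so it can register its own shutdown hook
>         // safely before ours ever fires.
>         log.debug("Adding shutdown hook")
>         Runtime.getRuntime.addShutdownHook(new Thread {
>           override def run(): Unit = log.debug("Shutdown hook called")
>         })
>       }
>     }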
>
> ERROR StatusLogger catching java.lang.IllegalStateException: Shutdown in progress
>         at java.lang.ApplicationShutdownHooks.add(ApplicationShutdownHooks.java:66)
>         at java.lang.Runtime.addShutdownHook(Runtime.java:211)
>         at org.apache.logging.log4j.core.util.DefaultShutdownCallbackRegistry.addShutdownHook(DefaultShutdownCallbackRegistry.java:136)
>         at org.apache.logging.log4j.core.util.DefaultShutdownCallbackRegistry.start(DefaultShutdownCallbackRegistry.java:125)
>         at org.apache.logging.log4j.core.impl.Log4jContextFactory.initializeShutdownCallbackRegistry(Log4jContextFactory.java:123)
>         at org.apache.logging.log4j.core.impl.Log4jContextFactory.<init>(Log4jContextFactory.java:89)
>         at org.apache.logging.log4j.core.impl.Log4jContextFactory.<init>(Log4jContextFactory.java:54)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
>         at java.lang.Class.newInstance(Class.java:438)
>         at org.apache.logging.log4j.LogManager.<clinit>(LogManager.java:96)
>         at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:102)
>         at org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:43)
>         at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
>         at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:29)
>         at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:285)
>         at org.apache.spark.Logging$class.log(Logging.scala:52)
>         at org.apache.spark.util.Utils$.log(Utils.scala:62)
>         at org.apache.spark.Logging$class.initializeLogging(Logging.scala:138)
>         at org.apache.spark.Logging$class.initializeIfNecessary(Logging.scala:107)
>         at org.apache.spark.Logging$class.log(Logging.scala:51)
>         at org.apache.spark.util.Utils$.log(Utils.scala:62)
>         at org.apache.spark.Logging$class.logDebug(Logging.scala:63)
>         at org.apache.spark.util.Utils$.logDebug(Utils.scala:62)
>         at org.apache.spark.util.Utils$$anon$4$$anonfun$run$1.apply$mcV$sp(Utils.scala:178)
>         at org.apache.spark.util.Utils$$anon$4$$anonfun$run$1.apply(Utils.scala:177)
>         at org.apache.spark.util.Utils$$anon$4$$anonfun$run$1.apply(Utils.scala:177)
>         at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1618)
>         at org.apache.spark.util.Utils$$anon$4.run(Utils.scala:177)
