Hi Sean,

We're not actually using log4j; we're trying to redirect all logging to
slf4j, which then uses logback as the logging implementation.
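
For context, the kind of dependency setup I mean is sketched below in SBT
syntax - the build tool, coordinates and versions here are only
illustrative, not necessarily exactly what we use:

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating" excludeAll(
  // keep Spark's slf4j->log4j binding and the real log4j off the classpath
  ExclusionRule("org.slf4j", "slf4j-log4j12"),
  ExclusionRule("log4j", "log4j")
)

libraryDependencies ++= Seq(
  "org.slf4j"      % "log4j-over-slf4j" % "1.7.5",   // routes log4j API calls to slf4j
  "ch.qos.logback" % "logback-classic"  % "1.0.13"   // slf4j implementation (logback)
)

With something like that in place, anything calling the log4j API should go
through slf4j and end up in logback.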

The fix you mentioned - am I right to assume it is not part of the latest
released Spark version (0.9.0)? If so, are there any workarounds, or any
advice on how to avoid this issue in 0.9.0?

--
Best regards,
Sergey Parhomenko


On 5 March 2014 14:40, Sean Owen <so...@cloudera.com> wrote:

> Yes, I think that issue is fixed (Patrick, you had the last eyes on it IIRC?)
>
> If you are using log4j, in general, do not redirect log4j to slf4j.
> Stuff using log4j is already using log4j, done.
> --
> Sean Owen | Director, Data Science | London
>
>
> On Wed, Mar 5, 2014 at 1:12 PM, Sergey Parhomenko <sparhome...@gmail.com>
> wrote:
> > Hi,
> >
> > I'm trying to redirect Spark logs to slf4j. Spark seems to be using
> > Log4J, so I did the typical steps of forcing a Log4J-based framework to
> > use slf4j - manually excluded slf4j-log4j12 and log4j, and included
> > log4j-over-slf4j. When doing that, however, Spark starts failing on
> > initialization with:
> > java.lang.StackOverflowError
> > at java.lang.ThreadLocal.access$400(ThreadLocal.java:72)
> > at java.lang.ThreadLocal$ThreadLocalMap.getEntry(ThreadLocal.java:376)
> > at java.lang.ThreadLocal$ThreadLocalMap.access$000(ThreadLocal.java:261)
> > at java.lang.ThreadLocal.get(ThreadLocal.java:146)
> > at java.lang.StringCoding.deref(StringCoding.java:63)
> > at java.lang.StringCoding.encode(StringCoding.java:330)
> > at java.lang.String.getBytes(String.java:916)
> > at java.io.UnixFileSystem.getBooleanAttributes0(Native Method)
> > at java.io.UnixFileSystem.getBooleanAttributes(UnixFileSystem.java:242)
> > at java.io.File.exists(File.java:813)
> > at sun.misc.URLClassPath$FileLoader.getResource(URLClassPath.java:1080)
> > at sun.misc.URLClassPath$FileLoader.findResource(URLClassPath.java:1047)
> > at sun.misc.URLClassPath.findResource(URLClassPath.java:176)
> > at java.net.URLClassLoader$2.run(URLClassLoader.java:551)
> > at java.net.URLClassLoader$2.run(URLClassLoader.java:549)
> > at java.security.AccessController.doPrivileged(Native Method)
> > at java.net.URLClassLoader.findResource(URLClassLoader.java:548)
> > at java.lang.ClassLoader.getResource(ClassLoader.java:1147)
> > at org.apache.spark.Logging$class.initializeLogging(Logging.scala:109)
> > at org.apache.spark.Logging$class.initializeIfNecessary(Logging.scala:97)
> > at org.apache.spark.Logging$class.log(Logging.scala:36)
> > at org.apache.spark.util.Utils$.log(Utils.scala:47)
> > <last 4 lines repeated many, many times>
> >
> > There's some related work done in SPARK-1071, but it was resolved after
> > 0.9.0 was released. In the last comment Sean refers to a
> > StackOverflowError which was discussed on the mailing list; I assume it
> > might be a problem similar to mine, but I was not able to find that
> > discussion.
> > Is anyone aware of a way to redirect Spark 0.9.0 logs to slf4j?
> >
> > --
> > Best regards,
> > Sergey Parhomenko
>
