Hi,

This sounds like quite an involved development effort to me, so I can't help here directly. I'd suggest going through the dev and user mailing lists for the past year, and the JIRA issues on the topic, as I vaguely remember some discussions about logging in Spark (which would merit an eventual migration to logback).
Regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski

On Mon, Feb 6, 2017 at 9:06 AM, Mendelson, Assaf <assaf.mendel...@rsa.com> wrote:
> Shading doesn't help (we already shaded everything).
>
> According to https://www.slf4j.org/codes.html#multiple_bindings only one
> binding can be used. The problem is that once we link to Spark's jars, we
> automatically inherit Spark's binding (for log4j).
>
> I would like to find a way to either send Spark's logs to log4j and my logs
> to logback, or send everything to logback.
>
> Assaf.
>
> From: Jacek Laskowski [mailto:ja...@japila.pl]
> Sent: Monday, February 06, 2017 12:47 AM
> To: Mendelson, Assaf
> Cc: user
> Subject: Re: using an alternative slf4j implementation
>
> Hi,
>
> Shading conflicting dependencies?
>
> Jacek
>
> On 5 Feb 2017 3:56 p.m., "Mendelson, Assaf" <assaf.mendel...@rsa.com> wrote:
> Hi,
>
> Spark seems to use log4j explicitly.
>
> This means that if I use an alternative backend for my application (e.g.
> ch.qos.logback) I have a conflict.
>
> Sure, I can exclude logback, but that means my application cannot use our
> internal tools.
>
> Is there a way to use logback as the logging backend while using Spark?
>
> Assaf.

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
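For readers landing on this thread later: the "send everything to logback" goal Assaf describes is usually approached by excluding the slf4j-log4j12 binding (and log4j itself) that Spark drags in, and adding the log4j-over-slf4j bridge so Spark's log4j calls are rerouted into slf4j and, from there, into logback. A minimal sbt sketch of that idea follows; the version numbers are illustrative assumptions, not something pinned down in the thread:

```scala
// build.sbt sketch -- versions here are assumptions, pick ones matching your setup.
libraryDependencies ++= Seq(
  // Pull in Spark, but drop its slf4j-log4j12 binding and log4j itself,
  // so logback-classic is the only slf4j binding left on the classpath.
  ("org.apache.spark" %% "spark-core" % "2.1.0")
    .exclude("org.slf4j", "slf4j-log4j12")
    .exclude("log4j", "log4j"),
  // Bridge: calls Spark makes through the log4j 1.x API are redirected
  // into slf4j, and from there into whatever binding is present.
  "org.slf4j" % "log4j-over-slf4j" % "1.7.25",
  // The single slf4j binding we actually want.
  "ch.qos.logback" % "logback-classic" % "1.2.3"
)
```

Note this only controls the classpath of an application that embeds Spark; when submitting to a real cluster, the executors run against the Spark distribution's own classpath, so further tweaks (e.g. `spark.driver.userClassPathFirst` / `spark.executor.userClassPathFirst`) may be needed and should be tested carefully.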