[ https://issues.apache.org/jira/browse/SPARK-14703?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15247434#comment-15247434 ]

Sean Owen commented on SPARK-14703:
-----------------------------------

Oh, logback tries to reimplement some of the log4j API? It sounds like that 
reimplementation isn't entirely binary compatible. My guess is that it ends up 
taking precedence over log4j itself, which is ideally where those calls would 
still route so that they do nothing.

Are you saying you want Spark to depend directly on logback to control log 
levels? Say logback had some other equivalent methods and we changed Spark to 
use those. We'd end up with the same problem, just with logback instead, and 
then nobody's existing log4j config would necessarily work with Spark anymore.

You can update log4j programmatically too (right?) but it does mean calling 
directly into it. 
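To illustrate what "calling directly into it" means: with the log4j 1.x API you can change a level at runtime along these lines (a minimal sketch; the logger name is just an example, and it assumes the real log4j jar is on the classpath):

```scala
import org.apache.log4j.{Level, LogManager}

// Directly call into log4j 1.x to change a level at runtime.
// This is exactly the kind of hard dependency on log4j that breaks
// when the SLF4J backend is actually logback rather than log4j.
val sparkLogger = LogManager.getLogger("org.apache.spark")
sparkLogger.setLevel(Level.WARN)
```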

Perhaps your goal is merely to use logback for your own calls, though. 
Normally, if you can plumb SLF4J into backend X, you can do that; but are you 
saying logback ends up colliding with log4j no matter which way you put this 
together?
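For what it's worth, the usual way to plumb that together is the log4j-over-slf4j bridge plus logback-classic, with real log4j and the slf4j-log4j12 binding kept off the classpath. A rough sbt sketch (version numbers are illustrative, not a recommendation):

```scala
// build.sbt fragment: route log4j 1.x calls through SLF4J into logback.
libraryDependencies ++= Seq(
  "ch.qos.logback" % "logback-classic"  % "1.1.3",
  "org.slf4j"      % "log4j-over-slf4j" % "1.7.21"
)

// Keep the real log4j jar and the slf4j-log4j12 binding out of the
// dependency graph, or the bridge and the binding will collide.
excludeDependencies ++= Seq(
  ExclusionRule("log4j", "log4j"),
  ExclusionRule("org.slf4j", "slf4j-log4j12")
)
```

Of course, this only works as long as nothing on the classpath depends on log4j internals that the bridge doesn't reimplement, which seems to be exactly the problem here.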

I tried updating Spark to log4j 2.x and hit a bunch of problems, the most 
serious of which was that every time a transitive dependency brought log4j 1.x 
classes back in, things broke until they were excluded again. But if log4j 2 
would somehow also work for you (it is, after all, a log4j 1 successor) and 
you want to take another run at that, I can look at it with you.

> Spark uses SLF4J, but actually relies quite heavily on Log4J
> ------------------------------------------------------------
>
>                 Key: SPARK-14703
>                 URL: https://issues.apache.org/jira/browse/SPARK-14703
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core, YARN
>    Affects Versions: 1.6.0
>         Environment: 1.6.0-cdh5.7.0, logback 1.1.3, yarn
>            Reporter: Matthew Byng-Maddick
>            Priority: Minor
>              Labels: log4j, logback, logging, slf4j
>         Attachments: spark-logback.patch
>
>
> We've built a version of Hadoop CDH-5.7.0 in house with logback as the SLF4J 
> provider, in order to send Hadoop logs straight to logstash (for processing 
> with logstash/elasticsearch), on top of our existing use of the logback 
> backend.
> In trying to start spark-shell, I discovered several points where the fact 
> that we weren't using real log4j caused the SparkContext not to be created, 
> or the YARN module not to load. There are many more places where the logging 
> should probably be wrapped more sensibly, but I have a basic patch that fixes 
> some of the worst offenders (at least the ones that stop the SparkContext 
> being created properly).
> I'm prepared to accept that this is not a good solution and that there 
> probably needs to be some sort of better wrapper, perhaps in the 
> Logging.scala class, that handles this properly.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
