[ https://issues.apache.org/jira/browse/SPARK-14703?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15247592#comment-15247592 ]

Ceki Gulcu commented on SPARK-14703:
------------------------------------

@srowen Being able to configure loggers has been an oft-requested feature for 
SLF4J. Can you briefly describe the *essential* configuration primitives you 
would like SLF4J to support? 

> Spark uses SLF4J, but actually relies quite heavily on Log4J
> ------------------------------------------------------------
>
>                 Key: SPARK-14703
>                 URL: https://issues.apache.org/jira/browse/SPARK-14703
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core, YARN
>    Affects Versions: 1.6.0
>         Environment: 1.6.0-cdh5.7.0, logback 1.1.3, yarn
>            Reporter: Matthew Byng-Maddick
>            Priority: Minor
>              Labels: log4j, logback, logging, slf4j
>         Attachments: spark-logback.patch
>
>
> We've built a version of Hadoop CDH-5.7.0 in house with logback as the SLF4J 
> provider, in order to send Hadoop logs straight to logstash (to be handled by 
> logstash/elasticsearch), on top of our existing use of the logback backend.
> In trying to start spark-shell, I discovered several points where the fact 
> that we weren't using a real Log4J caused the SparkContext (sc) not to be 
> created or the YARN module not to exist. There are many more places where we 
> should probably wrap the logging more sensibly, but I have a basic patch that 
> fixes some of the worst offenders (at least the ones that stop the 
> SparkContext from being created properly).
> I'm prepared to accept that this is not a good solution and that there 
> probably needs to be some sort of better wrapper, perhaps in the 
> Logging.scala class, which handles this properly.
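For context, a minimal sketch (not the attached patch) of the kind of guard that could let Spark skip Log4J-specific configuration when another SLF4J backend such as logback is bound. The object and method names here are illustrative assumptions; the factory class name follows SLF4J 1.x's static-binding convention.

```scala
// Hedged sketch: detect whether Log4J 1.2 is the bound SLF4J backend
// before making any Log4J-specific calls. Names are illustrative.
object Log4jDetect {
  // True if the named class is on the classpath; the class is looked up
  // without being initialized, so no logging side effects occur.
  def classIsLoadable(name: String): Boolean =
    try {
      Class.forName(name, false, getClass.getClassLoader)
      true
    } catch {
      case _: ClassNotFoundException => false
    }

  // Under SLF4J 1.x static binding, Log4J 1.2 is bound through this
  // adapter factory class; when logback is bound instead, it is absent.
  def usingLog4j12: Boolean =
    classIsLoadable("org.slf4j.impl.Log4jLoggerFactory")
}
```

Code that currently assumes Log4J (e.g. setting levels via org.apache.log4j.LogManager) could then branch on usingLog4j12 and fall back to doing nothing, rather than failing when logback is the provider.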



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
