[ https://issues.apache.org/jira/browse/SPARK-6305?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16523888#comment-16523888 ]

Hari Sekhon commented on SPARK-6305:
------------------------------------

Log4j 2.x would really help with integrating Spark logging into ELK, as a lot 
of things just don't work properly in Log4j 1.x. For example, 
layout.ConversionPattern for constructing JSON-enriched logs (such as logging 
user and app names to distinguish jobs and provide much-needed search 
usability) is simply ignored by the SocketAppender in Log4j 1.x :-/ The 
SyslogAppender does respect ConversionPattern, but it splits Java exceptions 
into multiple syslog messages, so the JSON no longer parses and routes to the 
right indices for the Yarn queue, nor can you reassemble the exception logs 
with a multiline codec at the other end, as you'd end up with corrupted input 
streams interleaved from multiple loggers :-/

Running Filebeat everywhere instead seems like overkill compared to being able 
to enable logging to a Logstash sink on an ad-hoc basis when debugging jobs, 
using the much better Log4j 2.x output appenders.
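As a sketch, a Log4j 2.x configuration along these lines would ship 
single-event JSON to a Logstash TCP input (the host and port here are 
assumptions, not a tested setup):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="warn">
  <Appenders>
    <!-- hypothetical Logstash sink; match host/port to your TCP input -->
    <Socket name="logstash" host="logstash.example.com" port="4560">
      <!-- JsonLayout emits one JSON object per log event, so a stack trace
           stays inside a single event instead of being split across syslog
           lines; properties="true" includes MDC fields (e.g. user/app name) -->
      <JsonLayout compact="true" eventEol="true" properties="true"/>
    </Socket>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="logstash"/>
    </Root>
  </Loggers>
</Configuration>
```

With eventEol="true" each event ends in a newline, so a plain `json_lines` 
codec on the Logstash side can parse it without any multiline reassembly.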

I hope someone finally manages to sort this out, as it's years overdue: Log4j 
1.x reached end of life 3 years ago, and there is a big jump in capabilities 
between Log4j 1.x and 2.x, both in the number of appenders and in the 
completeness of even the old appenders, such as the SocketAppender mentioned 
above.

> Add support for log4j 2.x to Spark
> ----------------------------------
>
>                 Key: SPARK-6305
>                 URL: https://issues.apache.org/jira/browse/SPARK-6305
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>            Reporter: Tal Sliwowicz
>            Priority: Minor
>
> log4j 2 requires replacing the slf4j binding and adding the log4j jars in the 
> classpath. Since there are shaded jars, it must be done during the build.
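For reference, the binding swap the issue describes typically looks like the 
following in a Maven POM (artifact versions and the Scala suffix are 
illustrative, not a tested combination):

```xml
<!-- Exclude the Log4j 1.x jar and its SLF4J binding pulled in transitively -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.3.1</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<!-- Add Log4j 2.x and its SLF4J binding -->
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-core</artifactId>
  <version>2.11.0</version>
</dependency>
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-slf4j-impl</artifactId>
  <version>2.11.0</version>
</dependency>
```

As the issue notes, this only works cleanly when Spark itself is built this 
way, since the shaded jars bundle the Log4j 1.x classes.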



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
