One way to integrate log4j2 would be to enable the flags
`spark.executor.userClassPathFirst` and `spark.driver.userClassPathFirst`
when submitting the application. This causes the application's class loader
to take precedence, so the log4j2 logging context gets initialized. But it
can also potentially break other things (e.g., dependencies that the Spark
master requires being overridden by the application's versions), so you
will need to verify.
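For example, a submission might look like the following (the class name and
jar are placeholders; only the two `--conf` flags are the relevant part):

```shell
# Hypothetical spark-submit invocation: the userClassPathFirst flags make
# the driver and executors load the application's jars (including log4j2
# and its log4j2.xml) before Spark's own classpath.
spark-submit \
  --class com.example.MyApp \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  my-app-assembly.jar
```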

More info about these flags:
https://spark.apache.org/docs/latest/configuration.html


On Wed, Jun 22, 2016 at 7:11 AM, Charan Adabala <charan.ha...@gmail.com>
wrote:

> Hi,
>
> We are trying to use log4j2.xml instead of log4j.properties in an Apache
> Spark application. We integrated log4j2.xml, but we are unable to write
> the application's worker logs, while the driver logs are written without
> any problem. Can anyone suggest how to integrate log4j2.xml in an Apache
> Spark application so that both worker and driver logs are written
> successfully?
>
> Thanks in advance..,
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Can-I-use-log4j2-xml-in-my-Apache-Saprk-application-tp27205.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>


-- 
Cheers,
Praj
