Re: using log4j2 with spark

2015-03-05 Thread Akhil Das
You can exclude the log4j dependency when building your application jar. Have a look at
this page to see how to exclude libraries:
http://databricks.gitbooks.io/databricks-spark-knowledge-base/content/troubleshooting/missing_dependencies_in_jar_files.html

Thanks
Best Regards
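
[Editor's note] The exclusion Akhil describes can be sketched in sbt roughly as below. This is a minimal sketch, not a tested build: the spark-core coordinates and version come from the thread, while the specific ExclusionRule targets are assumptions you would verify against your own dependency tree (e.g. with the sbt-dependency-graph plugin).

```scala
// build.sbt sketch: keep Spark's classes but drop its log4j 1.2 binding.
// The ExclusionRule targets below are assumptions -- check your actual
// dependency tree for the real offenders before relying on this.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1" excludeAll(
  ExclusionRule(organization = "log4j", name = "log4j"),           // log4j 1.2 itself
  ExclusionRule(organization = "org.slf4j", name = "slf4j-log4j12") // slf4j's log4j 1.2 binding
)
```

Note this only affects your own application jar; as the question below points out, the prebuilt spark-assembly.jar still bundles its own log4j 1.2 classes.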

On Thu, Mar 5, 2015 at 1:20 PM, Lior Chaga lio...@taboola.com wrote:

 Hi,
 I'm trying to run Spark 1.2.1 w/ Hadoop 1.0.4 on a cluster and configure it to
 run with log4j2.
 The problem is that spark-assembly.jar contains log4j and slf4j classes
 compatible with log4j 1.2, so Spark detects that it should use log4j 1.2 (
 https://github.com/apache/spark/blob/54e7b456dd56c9e52132154e699abca87563465b/core/src/main/scala/org/apache/spark/Logging.scala
 on line 121).

 Is there a Maven profile for building spark-assembly without the log4j
 dependencies, or any other way I can force Spark to use log4j2?

 Thanks!
 Lior



using log4j2 with spark

2015-03-04 Thread Lior Chaga
Hi,
I'm trying to run Spark 1.2.1 w/ Hadoop 1.0.4 on a cluster and configure it to
run with log4j2.
The problem is that spark-assembly.jar contains log4j and slf4j classes
compatible with log4j 1.2, so Spark detects that it should use log4j 1.2 (
https://github.com/apache/spark/blob/54e7b456dd56c9e52132154e699abca87563465b/core/src/main/scala/org/apache/spark/Logging.scala
on line 121).

Is there a Maven profile for building spark-assembly without the log4j
dependencies, or any other way I can force Spark to use log4j2?

Thanks!
Lior