Relying on classpath ordering is very brittle. If you can, use a system property
(see https://logging.apache.org/log4j/1.2/manual.html ) to point log4j at your own
configuration file instead.

For example:  -Dlog4j.configuration=mylog4j.properties

Note that log4j interprets the value as a URL; a bare name like the one above is
resolved as a classpath resource, so use a file: URL (e.g.
-Dlog4j.configuration=file:mylog4j.properties) to load a file from disk.
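With Spark specifically, the property has to reach both the driver and the executor JVMs, which is usually done through the extraJavaOptions settings. A sketch of what that can look like with spark-submit (the class name, paths, and jar name below are placeholders, not taken from this thread):

```shell
# Sketch: pass a custom log4j 1.2 config to driver and executors.
# com.example.MyApp, /path/to/mylog4j.properties, and my-app.jar are placeholders.
spark-submit \
  --class com.example.MyApp \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/path/to/mylog4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:mylog4j.properties" \
  --files /path/to/mylog4j.properties \
  my-app.jar
```

Here --files ships the properties file into each executor's working directory, which is why the executor side can reference it by bare file name while the driver uses an absolute path.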


> On 21 Jul 2015, at 00:57, igor.berman <igor.ber...@gmail.com> wrote:
> 
> Hi,
> I have log4j.xml in my jar
> Since 1.4.1 it seems that log4j.properties in spark/conf comes first on the
> classpath, so spark/conf/log4j.properties "wins";
> before that (in v1.3.0) the log4j.xml bundled in the jar defined the configuration
> 
> If I manually add my jar so it is strictly first on the classpath (by adding it to
> SPARK_CLASSPATH in spark-env.sh), the log4j.xml in the jar wins
> 
> Does somebody know what changed? Any ideas?
> PS: I tried spark.driver.userClassPathFirst=true and
> spark.executor.userClassPathFirst=true, but I get strange errors
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/log4j-xml-bundled-in-jar-vs-log4-properties-in-spark-conf-tp23923.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 
> 

