I can't get logging to work for my classes. I'm using CDH 5.4 with Spark 1.3.0.
I have a Scala class with some `log.debug` calls, and I created an object to hold the logger:

```scala
package example.spark

import org.apache.log4j.Logger

object Holder extends Serializable {
  @transient lazy val log = Logger.getLogger(getClass.getName)
}
```

I use this logger inside a map function, which runs on the executors, and I'm looking for the log output in the executor logs (YARN). There is a rough sketch of how I call it at the end of this post.

My log4j.properties is:

```
log4j.appender.myConsoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.myConsoleAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.myConsoleAppender.layout.ConversionPattern=%d [%t] %-5p %c - %m%n

log4j.appender.RollingAppender=org.apache.log4j.DailyRollingFileAppender
log4j.appender.RollingAppender.File=/opt/centralLog/log/spark.log
log4j.appender.RollingAppender.DatePattern='.'yyyy-MM-dd
log4j.appender.RollingAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.RollingAppender.layout.ConversionPattern=[%p] %d %c %M - %m%n

log4j.rootLogger=INFO, myConsoleAppender, RollingAppender
log4j.logger.example.spark=DEBUG, RollingAppender, myConsoleAppender
```

And I created a script to run Spark:

```bash
#!/bin/bash
export HADOOP_CONF_DIR=/etc/hadoop/conf
export SPARK_CONF_DIR=/opt/centralLogs/conf

SPARK_CLASSPATH="file:/etc/spark/conf.cloudera.spark_on_yarn/yarn-conf/"
for lib in `ls /opt/centralLogs/lib/*.jar`
do
  if [ -z "$SPARK_CLASSPATH" ]; then
    SPARK_CLASSPATH=$lib
  else
    SPARK_CLASSPATH=$SPARK_CLASSPATH,$lib
  fi
done

spark-submit --name "CentralLog" --master yarn-client \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --class example.spark.CentralLog \
  --jars $SPARK_CLASSPATH,file:/opt/centralLogs/conf/log4j.properties \
  --executor-memory 2g \
  /opt/centralLogs/libProject/paas.jar XXXXX kafka-topic3 XXXXX,XXXXX,XXXXX
```

I added `file:/opt/centralLogs/conf/log4j.properties` to `--jars`, but it's not working: I still can't see the debug logs.
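
For context, this is roughly how I call the logger from inside the map. The RDD contents and the work done in the map are simplified placeholders, not my real job; only the logging pattern is the same as in my code:

```scala
package example.spark

import org.apache.log4j.Logger
import org.apache.spark.rdd.RDD

object Holder extends Serializable {
  @transient lazy val log = Logger.getLogger(getClass.getName)
}

// Simplified placeholder for the real job: I call log.debug inside the map,
// which runs on the executors, so I expect these lines in the executor logs.
object UsageSketch {
  def process(lines: RDD[String]): RDD[Int] = {
    lines.map { line =>
      Holder.log.debug(s"processing line: $line")
      line.length
    }
  }
}
```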