I changed the position of --files and now it works.

(IT DOESN'T WORK)
spark-submit --conf spark.metrics.conf=metrics.properties --name "myProject" \
  --master yarn-cluster --class myCompany.spark.MyClass \
  --files /opt/myProject/conf/log4j.properties --jars $SPARK_CLASSPATH \
  --executor-memory 1024m --num-executors 5 --executor-cores 1 \
  --driver-memory 1024m /opt/myProject/myJar.jar

(IT WORKS)
spark-submit --conf spark.metrics.conf=metrics.properties --name "myProject" \
  --master yarn-cluster --class myCompany.spark.MyClass \
  --jars $SPARK_CLASSPATH \
  --executor-memory 1024m --num-executors 5 --executor-cores 1 \
  --driver-memory 1024m --files /opt/myProject/conf/log4j.properties \
  /opt/myProject/myJar.jar

I don't think I made any other changes.
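In case the executors still pick up the cluster's default log4j configuration, a commonly documented approach is to ship the file with --files and then point each JVM at it explicitly via the extraJavaOptions settings. This is a sketch built on the command from this thread, not something I have verified against this exact setup; note that -Dlog4j.configuration names the bare file name, since YARN places the shipped file in each container's working directory:

```shell
spark-submit --conf spark.metrics.conf=metrics.properties --name "myProject" \
  --master yarn-cluster --class myCompany.spark.MyClass \
  --jars $SPARK_CLASSPATH \
  --executor-memory 1024m --num-executors 5 --executor-cores 1 \
  --driver-memory 1024m \
  --files /opt/myProject/conf/log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  /opt/myProject/myJar.jar
```

With this, the driver and every executor load the same shipped log4j.properties instead of the one bundled in the Spark assembly, so the DEBUG level applies on the executors as well.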



2016-03-30 15:42 GMT+02:00 Guillermo Ortiz <konstt2...@gmail.com>:

> I'm trying to configure log4j in Spark.
>
> spark-submit  --conf spark.metrics.conf=metrics.properties --name
> "myProject" --master yarn-cluster --class myCompany.spark.MyClass --files
> /opt/myProject/conf/log4j.properties --jars $SPARK_CLASSPATH
> --executor-memory 1024m --num-executors 5  --executor-cores 1
> --driver-memory 1024m  /opt/myProject/myJar.jar
>
> I have this log4j.properties
> log4j.rootCategory=DEBUG, RollingAppender, myConsoleAppender
> #log4j.logger.mycompany.spark=DEBUG
> log4j.category.myCompany.spark=DEBUG
> spark.log.dir=/opt/myProject/log
> spark.log.file=spark.log
>
> log4j.appender.myConsoleAppender=org.apache.log4j.ConsoleAppender
> log4j.appender.myConsoleAppender.layout=org.apache.log4j.PatternLayout
> log4j.appender.myConsoleAppender.Target=System.out
> log4j.appender.myConsoleAppender.layout.ConversionPattern=%d [%t] %-5p %c
> - %m%n
>
> log4j.appender.RollingAppender=org.apache.log4j.DailyRollingFileAppender
> log4j.appender.RollingAppender.MaxFileSize=50MB
> log4j.appender.RollingAppender.MaxBackupIndex=5
> log4j.appender.RollingAppender.layout.ConversionPattern=%d{dd MMM yyyy
> HH:mm:ss,SSS} %-5p [%t] (%C.%M:%L) %x - %m%n
> log4j.appender.RollingAppender.File=${spark.log.dir}/${spark.log.file}
> log4j.appender.RollingAppender.layout=org.apache.log4j.PatternLayout
> log4j.appender.RollingAppender.layout.ConversionPattern=[%p] %d %c %M -
> %m%n
>
> With this I see the driver log at DEBUG level, but the executors at
> INFO level. Why can't I see the executor logs at DEBUG level?
> I'm using Spark 1.5.0
