This actually doesn't seem to work for executors. I have a file {{log4j.properties.debug}} with the following content:
{code}
log4j.rootCategory=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Set the default spark-shell log level to WARN. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=INFO

# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark-project.jetty=WARN
log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR

# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
{code}
And I've run my job as follows:
{code}
root@ip-10-0-6-74:/opt/spark/dist# ./bin/spark-shell \
  --keytab $(pwd)/hadoop-install/keytabs/nn.10.0.2.keytab \
  --principal nn/10.0.2.103@LOCAL \
  --master mesos://leader.mesos:5050 \
  --conf spark.mesos.executor.docker.image=mesosphere/spark:1.0.7-2.1.0-hadoop-2.7 \
  --conf spark.mesos.executor.home=/opt/spark/dist \
  --conf spark.mesos.uris=http://mgummelt-mesos.s3.amazonaws.com/log4j.properties.debug \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=/mnt/mesos/sandbox/log4j.properties.debug"
{code}
I've verified that {{/mnt/mesos/sandbox/log4j.properties.debug}} exists in the executor's filesystem, and that the executor process is run with {{-Dlog4j.configuration=/mnt/mesos/sandbox/log4j.properties.debug}}.
But debug logging is not enabled, and the executors print:
{code}
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/01/30 02:43:34 INFO CoarseGrainedExecutorBackend: Started daemon with process name: 9@ip-10-0-...-159.us-west-2.compute.internal
17/01/30 02:43:34 INFO SignalUtils: Registered signal handler for TERM
17/01/30 02:43:34 INFO SignalUtils: Registered signal handler for HUP
17/01/30 02:43:34 INFO SignalUtils: Registered signal handler for INT
{code}
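For anyone reproducing this, a one-off task run from the same spark-shell can confirm what the executor JVM actually sees. This is a diagnostic sketch, not part of the original report; it assumes the standard {{sc}} available in the shell and the sandbox path from above:

{code}
// Runs a single task on an executor and reports back:
//  - the value of -Dlog4j.configuration as seen by the executor JVM
//  - whether the properties file is visible at the expected sandbox path
sc.parallelize(Seq(1), 1).map { _ =>
  val prop = System.getProperty("log4j.configuration")
  val exists = new java.io.File("/mnt/mesos/sandbox/log4j.properties.debug").exists
  s"log4j.configuration=$prop, file exists=$exists"
}.collect().foreach(println)
{code}

Since log4j 1.x interprets {{log4j.configuration}} as a URL before falling back to a classpath resource lookup, it may also be worth retrying with an explicit scheme, i.e. {{-Dlog4j.configuration=file:/mnt/mesos/sandbox/log4j.properties.debug}}.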