I am experiencing significant logging spam when running PySpark in IPython
Notebook.

Exhibit A: http://i.imgur.com/BDP0R2U.png

I have already taken into consideration the advice from
http://apache-spark-user-list.1001560.n3.nabble.com/Disable-all-spark-logging-td1960.html
and
http://stackoverflow.com/questions/25193488/how-to-turn-off-info-logging-in-pyspark


I have only one log4j.properties; it is in /opt/spark-1.1.0/conf.

Just before I launch IPython Notebook with a pyspark profile, I add the
directory and the properties file directly to the CLASSPATH and
SPARK_CLASSPATH environment variables (as you can also see in the png).
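Concretely, the setup step looks roughly like this (a sketch reconstructed from the description above; the launch line is commented out):

```shell
# Point at the one conf dir that holds log4j.properties (path as in my install).
export SPARK_CONF_DIR=/opt/spark-1.1.0/conf

# Add both the dir and the file itself to the classpath vars, as described above.
# (Note: classpath entries are normally directories or jars, so the dir is the
# entry that should matter for log4j picking up the file.)
export CLASSPATH="$SPARK_CONF_DIR:$SPARK_CONF_DIR/log4j.properties${CLASSPATH:+:$CLASSPATH}"
export SPARK_CLASSPATH="$SPARK_CONF_DIR:$SPARK_CONF_DIR/log4j.properties${SPARK_CLASSPATH:+:$SPARK_CLASSPATH}"

# Then launch:
# ipython notebook --profile=pyspark
```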

I still haven't found any change that disables this infernal INFO/debug
output.

Any ideas (WAGs, solutions, commiseration) would be greatly appreciated.

---

My log4j.properties:

log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Change this to set Spark log level
log4j.logger.org.apache.spark=INFO

# Silence akka remoting
log4j.logger.Remoting=WARN

# Ignore messages below warning level from Jetty, because it's a bit verbose
log4j.logger.org.eclipse.jetty=WARN
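For what it's worth, if this file were actually being picked up, I would expect dropping these two levels to WARN to quiet the INFO lines, i.e.:

```properties
# Same file, with the root and Spark levels dropped to WARN
# (this assumes log4j is actually reading this properties file)
log4j.rootCategory=WARN, console
log4j.logger.org.apache.spark=WARN
```

But since the INFO spam persists regardless of what I put here, my suspicion is that log4j is loading some other configuration entirely.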
