Hi,
Can somebody please help me with how to keep Spark and Hive log output out of my
application log?
Both Spark and Hive use a log4j properties file. I have configured my
application's log4j.properties as shown below, but the Spark and Hive console
logging is still being printed as well. Please suggest a fix; this is urgent for
me. I am running the application in an HDFS environment.

log4j.rootLogger=DEBUG,debugLog, SplLog

log4j.appender.debugLog=org.apache.log4j.RollingFileAppender
log4j.appender.debugLog.File=logs/Debug.log
log4j.appender.debugLog.MaxFileSize=10MB
log4j.appender.debugLog.MaxBackupIndex=10
log4j.appender.debugLog.layout=org.apache.log4j.PatternLayout
log4j.appender.debugLog.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1} - %m%n
log4j.appender.debugLog.filter.f1=org.apache.log4j.varia.LevelRangeFilter
log4j.appender.debugLog.filter.f1.LevelMax=DEBUG
log4j.appender.debugLog.filter.f1.LevelMin=DEBUG

log4j.appender.SplLog=org.apache.log4j.RollingFileAppender
log4j.appender.SplLog.File=logs/AppSplCmd.log
log4j.appender.SplLog.MaxFileSize=10MB
log4j.appender.SplLog.MaxBackupIndex=10
log4j.appender.SplLog.layout=org.apache.log4j.PatternLayout
log4j.appender.SplLog.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1} - %m%n
log4j.appender.SplLog.filter.f1=org.apache.log4j.varia.LevelRangeFilter
log4j.appender.SplLog.filter.f1.LevelMax=FATAL
log4j.appender.SplLog.filter.f1.LevelMin=INFO

log4j.logger.debugLogger=DEBUG, debugLog
log4j.additivity.debugLogger=false

log4j.logger.AppSplLogger=INFO, SplLog
log4j.additivity.AppSplLogger=false
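
In case it helps, this is roughly how the application obtains the two named
loggers configured above (a simplified sketch; the actual class and messages
in my application are different):

import org.apache.log4j.Logger;

public class MyApp {  // hypothetical class name, for illustration only

    // Named loggers matching the log4j.properties above:
    // "debugLogger" writes to logs/Debug.log, "AppSplLogger" to logs/AppSplCmd.log.
    private static final Logger debugLogger = Logger.getLogger("debugLogger");
    private static final Logger splLogger = Logger.getLogger("AppSplLogger");

    public static void main(String[] args) {
        debugLogger.debug("application debug message"); // only the debugLog appender (additivity=false)
        splLogger.info("application info message");     // only the SplLog appender (additivity=false)
    }
}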


Thanks in advance,






