Sent: February 2019, 7:32 AM
To: Jain, Abhishek 3. (Nokia - IN/Bangalore); 'Deepak Sharma'
Cc: 'spark users'
Subject: RE: Spark streaming filling the disk with logs
I have a quick question about this configuration, particularly this line:
log4j.appender.rolling.file=/var/log/spark/
Where is that file located?
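If that line really does end at the directory, note that log4j's file-based appenders expect the file property to name an actual log file rather than a directory; a minimal sketch (the name app.log is illustrative, not from the thread):
log4j.appender.rolling.file=/var/log/spark/app.log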
From: Jain, Abhishek 3. (Nokia - IN/Bangalore)
Sent: Thursday, February 14, 2019 7:48 AM
To: Deepak Sharma
Cc: spark users
Subject: RE: Spark streaming filling the disk with logs
++
If you can afford losing a few old logs, you can make use of a rolling file appender as well.
log4j.rootLogger=INFO, rolling
log4j.appender.rolling=org.apache.log4j.RollingFileAppender
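A complete rolling-appender setup along these lines might look like the following; the path, pattern, file size, and backup count are illustrative, not from the original message:

log4j.appender.rolling.layout=org.apache.log4j.PatternLayout
log4j.appender.rolling.layout.conversionPattern=[%d] %p %m (%c)%n
log4j.appender.rolling.maxFileSize=50MB
log4j.appender.rolling.maxBackupIndex=5
log4j.appender.rolling.file=/var/log/spark/app.log

With these settings log4j rotates the file at 50MB and keeps at most 5 backups, so old logs are overwritten instead of filling the disk.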
From: Jain, Abhishek 3. (Nokia - IN/Bangalore)
Sent: Thursday, February 14, 2019 5:58 PM
To: Deepak Sharma
Cc: spark users
Subject: RE: Spark streaming filling the disk with logs
Hi Deepak,
The spark logging can be set for different purposes. Say, for example, if you want to control the spark-submit log, you can set the log level per logger:
log4j.logger.org.apache.parquet= etc.
These properties can be set in the conf/log4j.properties file.
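As a concrete sketch, the logger names below are taken from Spark's default conf/log4j.properties.template; the levels shown are illustrative:

log4j.logger.org.spark_project.jetty=WARN
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR

A logger name matches any class in that package or below, so a single line can quiet an entire library.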
Hope this helps!
Regards,
Abhishek Jain
From: Deepak Sharma
Sent: Thursday, February 14, 2019 12:10 PM
To: spark users
Subject: Spark streaming filling the disk with logs
Hi All
I am running a Spark streaming job with the below configuration:
--conf "spark.executor.extraJavaOptions=-Droot.logger=WARN,console"
But it's still filling the disk with info logs.
If the logging level is set to WARN at the cluster level, then only the WARN
logs get written, but that affects every job on the cluster.
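One commonly used way to scope this to a single application rather than the whole cluster is to ship a per-job log4j file with spark-submit; a sketch, where the file name log4j-warn.properties is illustrative:

spark-submit \
  --files /path/to/log4j-warn.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j-warn.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j-warn.properties" \
  ...

--files places the properties file in each container's working directory, and -Dlog4j.configuration points log4j at it; depending on the cluster manager, the value may need to be given as a file: URL.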