RE: Spark streaming filling the disk with logs

2019-02-14 Thread Jain, Abhishek 3. (Nokia - IN/Bangalore)
Sent: Thursday, February 14, 2019 7:32 AM
To: Jain, Abhishek 3. (Nokia - IN/Bangalore); 'Deepak Sharma'
Cc: 'spark users'
Subject: RE: Spark streaming filling the disk with logs

I have a quick question about this configuration, particularly this line: log4j.appender.rolling.file=/var/log/spark/ Where is that path at? At the dri…

RE: Spark streaming filling the disk with logs

2019-02-14 Thread email
Sent: Thursday, February 14, 2019 7:48 AM
To: Deepak Sharma
Cc: spark users
Subject: RE: Spark streaming filling the disk with logs

++ If you can afford losing a few old logs, then you can make use of the rolling file appender as well. log4j.rootLogger=INFO, rolling log4j.appender.rolling…
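The rolling-appender snippet in this reply is cut off by the archive. A complete log4j 1.x configuration along the lines being suggested might look as follows; the log path, file size, and backup count here are illustrative assumptions, not values from the thread:

```properties
# Route everything through a size-based rolling file appender
log4j.rootLogger=INFO, rolling

log4j.appender.rolling=org.apache.log4j.RollingFileAppender
log4j.appender.rolling.layout=org.apache.log4j.PatternLayout
log4j.appender.rolling.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# Assumed location; point this at a partition with room for logs
log4j.appender.rolling.file=/var/log/spark/spark.log
# Roll the file at 50 MB and keep at most 5 old files, so total
# disk usage is capped at roughly 6 x 50 MB per process
log4j.appender.rolling.maxFileSize=50MB
log4j.appender.rolling.maxBackupIndex=5
```

With maxBackupIndex set, the oldest rolled file is deleted on each rollover, which is the "losing a few old logs" trade-off mentioned above.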

RE: Spark streaming filling the disk with logs

2019-02-14 Thread Jain, Abhishek 3. (Nokia - IN/Bangalore)
From: Jain, Abhishek 3. (Nokia - IN/Bangalore)
Sent: Thursday, February 14, 2019 5:58 PM
To: Deepak Sharma
Cc: spark users
Subject: RE: Spark streaming filling the disk with logs

Hi Deepak, Spark logging can be set for different purposes. Say, for example, you want to control the spark-submit log…

RE: Spark streaming filling the disk with logs

2019-02-14 Thread Jain, Abhishek 3. (Nokia - IN/Bangalore)
…=, log4j.logger.org.apache.parquet= etc. These properties can be set in the conf/log4j.properties file. Hope this helps! Regards, Abhishek Jain

From: Deepak Sharma
Sent: Thursday, February 14, 2019 12:10 PM
To: spark users
Subject: Spark streaming filling the disk with logs

Hi All, I am running a spark…
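The per-package logger lines above are truncated before their level values. For reference, quieting individual noisy packages in conf/log4j.properties looks like this; the specific packages and levels chosen here are examples, not a recommendation from the thread:

```properties
# Keep the root logger at INFO but silence chatty libraries
log4j.rootLogger=INFO, console
log4j.logger.org.apache.spark=WARN
log4j.logger.org.apache.parquet=ERROR
log4j.logger.org.spark_project.jetty=WARN
```

The most specific logger name wins, so application code under other packages keeps logging at INFO while the listed namespaces are filtered.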

Spark streaming filling the disk with logs

2019-02-13 Thread Deepak Sharma
Hi All, I am running a spark streaming job with the below configuration:
--conf "spark.executor.extraJavaOptions=-Droot.logger=WARN,console"
But it’s still filling the disk with INFO logs. If the logging level is set to WARN at the cluster level, then only the WARN logs are getting written, but then it…
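One common alternative to overriding only -Droot.logger, as the replies above suggest, is to ship a full custom log4j configuration with the job so that both driver and executors pick it up. A sketch of the spark-submit invocation, assuming log4j 1.x (Spark 2.x era) and a local file named log4j-quiet.properties; the class name and jar are hypothetical placeholders:

```shell
# log4j-quiet.properties is distributed via --files so executors can load
# it by bare filename from their working directory.
# com.example.StreamingJob and streaming-job.jar are hypothetical
# placeholders for the actual application.
spark-submit \
  --files log4j-quiet.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j-quiet.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j-quiet.properties" \
  --class com.example.StreamingJob \
  streaming-job.jar
```

Depending on deploy mode, the driver may need a file: URI with an absolute path instead of the bare filename; the executor side reliably sees the --files copy in its working directory.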