Hi all,

I am running a Spark Streaming job with the configuration below:

--conf "spark.executor.extraJavaOptions=-Droot.logger=WARN,console"
But it is still filling the disk with INFO logs. If the logging level is set to WARN at the cluster level, then only WARN logs are written, but that affects all jobs. Is there any way to suppress INFO-level logging for just this Spark Streaming job?

Thanks,
Deepak
www.bigdatabig.com
www.keosha.net
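(For context, a commonly suggested per-job override is to ship a custom log4j properties file with --files and point both the driver and executor JVMs at it via -Dlog4j.configuration, instead of relying on -Droot.logger, which only has an effect if the cluster's log4j config actually references that variable. The sketch below is illustrative; the file name log4j-job.properties, the class name, and the jar are placeholders, not from the original job.)

```shell
# Hypothetical per-job log4j override (Spark 2.x / log4j 1.x style).
# log4j-job.properties is a local file containing, for example:
#   log4j.rootCategory=WARN, console
#   log4j.appender.console=org.apache.log4j.ConsoleAppender
#   log4j.appender.console.layout=org.apache.log4j.PatternLayout
#   log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

spark-submit \
  --files log4j-job.properties \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j-job.properties" \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/path/to/log4j-job.properties" \
  --class com.example.StreamingJob \
  streaming-job.jar
```

Executors receive the file in their working directory via --files, so the bare file name works there; in client mode the driver reads the local copy, hence the file: URL. Paths may need adjusting for cluster deploy mode.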