Found the issue: a conflict between setting Java options in both
spark-defaults.conf and on the spark-submit command line.
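For context, the conflict can arise like this (the paths and flag values below are illustrative, not the cluster's actual settings): spark-defaults.conf already sets spark.driver.extraJavaOptions, and spark-submit's --driver-java-options maps to the same property, so the command-line value replaces, rather than merges with, the value from the defaults file:

    # /etc/spark/conf/spark-defaults.conf
    spark.driver.extraJavaOptions  -Dlog4j.configuration=file:///etc/spark/conf/log4j.properties

    # Passing driver Java options again on the command line silently
    # discards the -Dlog4j.configuration setting above (illustrative):
    spark-submit --driver-java-options "-XX:MaxPermSize=512M" ...

If both sources need to contribute options, the full set has to be repeated in one place.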

--

    Nick


________________________________
From: Afshartous, Nick <nafshart...@turbine.com>
Sent: Friday, December 18, 2015 11:46 AM
To: user@spark.apache.org
Subject: Configuring log4j



Hi,


I am trying to configure log4j on an AWS EMR 4.2 Spark cluster for a streaming
job submitted in client mode.


I changed


   /etc/spark/conf/log4j.properties


to use a FileAppender.  However, INFO logging still goes to the console.
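For reference, a minimal log4j 1.x properties file routing the root logger to a FileAppender might look like the sketch below (the log path and pattern layout are assumptions for illustration, not the file actually used on the cluster):

    # Route root logger output to a file instead of the console
    log4j.rootCategory=INFO, file
    log4j.appender.file=org.apache.log4j.FileAppender
    # Illustrative path; pick a location writable by the Spark user
    log4j.appender.file.File=/var/log/spark/driver.log
    log4j.appender.file.layout=org.apache.log4j.PatternLayout
    log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

Note that this file only takes effect if the JVM actually loads it, i.e. if no competing -Dlog4j.configuration setting points the driver elsewhere.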


Thanks for any suggestions,

--

    Nick


>From the console:

Adding default property:
spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///etc/spark/conf/log4j.properties -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70 -XX:MaxHeapFreeRatio=70 -XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=512M -XX:OnOutOfMemoryError='kill -9 %p'
