Re: disable log4j for spark-shell
Edit your conf/log4j.properties file and change the following line:

    log4j.rootCategory=INFO, console

to:

    log4j.rootCategory=ERROR, console

Another approach would be to fire up spark-shell and type in the following:

    import org.apache.log4j.Logger
    import org.apache.log4j.Level

    Logger.getLogger("org").setLevel(Level.OFF)
    Logger.getLogger("akka").setLevel(Level.OFF)

You won't see any logs after that.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/disable-log4j-for-spark-shell-tp11278p21010.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
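The conf-file edit above can be sketched as a couple of shell commands. This is a minimal illustration run against a throwaway copy rather than a real Spark install; the file contents here are a one-line stub, not the full stock template.

```shell
# Stub standing in for conf/log4j.properties (illustrative contents only)
work_dir=$(mktemp -d)
printf 'log4j.rootCategory=INFO, console\n' > "$work_dir/log4j.properties"

# Flip the root category from INFO to ERROR, as the message suggests
sed -i 's/^log4j.rootCategory=INFO, console$/log4j.rootCategory=ERROR, console/' \
    "$work_dir/log4j.properties"

grep '^log4j.rootCategory' "$work_dir/log4j.properties"
# -> log4j.rootCategory=ERROR, console
```

With a real install you would run the sed line against $SPARK_HOME/conf/log4j.properties instead of the stub.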
Re: disable log4j for spark-shell
Another option is to make a copy of log4j.properties in the current directory where you start spark-shell from, and change log4j.rootCategory=INFO, console to log4j.rootCategory=ERROR, console. Then start the shell.

On Wed, Jan 7, 2015 at 3:39 AM, Akhil ak...@sigmoidanalytics.com wrote:
> Edit your conf/log4j.properties file and change the following line:
>
>     log4j.rootCategory=INFO, console
>
> to:
>
>     log4j.rootCategory=ERROR, console
>
> Another approach would be to fire up spark-shell and type in the following:
>
>     import org.apache.log4j.Logger
>     import org.apache.log4j.Level
>
>     Logger.getLogger("org").setLevel(Level.OFF)
>     Logger.getLogger("akka").setLevel(Level.OFF)
>
> You won't see any logs after that.
Re: disable log4j for spark-shell
Go to your Spark home, then into the conf/ directory, and edit the log4j.properties file, i.e.:

    gedit $SPARK_HOME/conf/log4j.properties

and set the root logger to:

    log4j.rootCategory=WARN, console

You don't need to rebuild Spark for the changes to take place. Whenever you open spark-shell, it by default looks into the conf directory and loads all the properties.

Thanks

On Tue, Nov 11, 2014 at 6:34 AM, lordjoe lordjoe2...@gmail.com wrote:
>     public static void main(String[] args) throws Exception {
>         System.out.println("Set Log to Warn");
>         Logger rootLogger = Logger.getRootLogger();
>         rootLogger.setLevel(Level.WARN);
>         ...
>
> works for me
>
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/disable-log4j-for-spark-shell-tp11278p18535.html
Re: disable log4j for spark-shell
Tried --driver-java-options and SPARK_JAVA_OPTS; neither of them worked. Had to change the default one and rebuild.

View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/disable-log4j-for-spark-shell-tp11278p18513.html
Re: disable log4j for spark-shell
Even after changing core/src/main/resources/org/apache/spark/log4j-defaults.properties to WARN, followed by a rebuild, the log level is still INFO. Any other suggestions?

View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/disable-log4j-for-spark-shell-tp11278p18518.html
Re: disable log4j for spark-shell
Some console messages:

    14/11/10 20:04:33 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:46713
    14/11/10 20:04:33 INFO util.Utils: Successfully started service 'HTTP file server' on port 46713.
    14/11/10 20:04:34 INFO server.Server: jetty-8.y.z-SNAPSHOT
    14/11/10 20:04:34 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
    14/11/10 20:04:34 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
    14/11/10 20:04:34 INFO netty.NettyBlockTransferService: Server created on 46997
    14/11/10 20:04:34 INFO storage.BlockManagerMaster: Trying to register BlockManager
    14/11/10 20:04:34 INFO storage.BlockManagerMasterActor: Registering block manager localhost:46997 with 265.0 MB RAM, BlockManagerId(driver, localhost, 46997)
    14/11/10 20:04:35 INFO storage.BlockManagerMaster: Registered BlockManager

and the log4j-defaults.properties looks like:

    $ cat core/src/main/resources/org/apache/spark/log4j-defaults.properties
    # Set everything to be logged to the console
    log4j.rootCategory=WARN, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

    # Settings to quiet third party logs that are too verbose
    log4j.logger.org.eclipse.jetty=WARN
    log4j.logger.org.eclipse.jetty.util.component.AbstractLifeCycle=ERROR
    log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=WARN
    log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=WARN

Any suggestions?

View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/disable-log4j-for-spark-shell-tp11278p18520.html
Re: disable log4j for spark-shell
    public static void main(String[] args) throws Exception {
        System.out.println("Set Log to Warn");
        Logger rootLogger = Logger.getRootLogger();
        rootLogger.setLevel(Level.WARN);
        ...

works for me

View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/disable-log4j-for-spark-shell-tp11278p18535.html
Re: disable log4j for spark-shell
You just have to tell Spark which log4j properties file to use. I think

    --driver-java-options=-Dlog4j.configuration=log4j.properties

should work, but it didn't for me.

    set SPARK_JAVA_OPTS=-Dlog4j.configuration=log4j.properties

did work, though (this was on Windows, in local mode, assuming you put a file called log4j.properties in the bin directory where you run the shell from). In either case, like Sean said, -Dlog4j.configuration=... is the magic incantation; you just have to figure out how to pass it depending on what version of Spark you're using. (I personally find setting PRINT_SPARK_LAUNCH_COMMAND=1 very very useful when I'm trying to track a property.)

View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/disable-log4j-for-spark-shell-tp11278p12942.html
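The Windows `set` line above has a direct Linux/macOS counterpart. A minimal sketch follows; spark-shell itself is not launched here, and whether SPARK_JAVA_OPTS is honored depends on your Spark version, as the thread notes.

```shell
# Export the same -D property that the Windows `set` command passes;
# bin/spark-shell would pick this up on versions that honor SPARK_JAVA_OPTS.
export SPARK_JAVA_OPTS="-Dlog4j.configuration=log4j.properties"
echo "launching with: $SPARK_JAVA_OPTS"
# -> launching with: -Dlog4j.configuration=log4j.properties
```

The log4j.properties file is then expected in the directory you start the shell from, matching the setup described above.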
Re: disable log4j for spark-shell
If someone doesn't have access to do that, is there any easy way to specify a different properties file to be used?

Patrick Wendell wrote:
> If you want to customize the logging behavior - the simplest way is to copy conf/log4j.properties.template to conf/log4j.properties. Then you can go and modify the log level in there. The spark shells should pick this up.

View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/disable-log4j-for-spark-shell-tp11278p12850.html
Re: disable log4j for spark-shell
That's just a template. Nothing consults that file by default; it's looking inside the Spark .jar. If you edit core/src/main/resources/org/apache/spark/log4j-defaults.properties and rebuild Spark, it will pick up those changes.

I think you could also use the JVM argument -Dlog4j.configuration=conf/log4j-defaults.properties to force it to look at your local, edited props file. Someone may have to correct me, but I think that in master right now, that means using --driver-java-options=... to set an argument to the JVM that runs the shell?

On Sun, Aug 3, 2014 at 2:07 PM, Gil Vernik g...@il.ibm.com wrote:
> Hi,
>
> I would like to run spark-shell without any INFO messages printed. To achieve this I edited /conf/log4j.properties and added the line log4j.rootLogger=OFF, which is supposed to disable all logging. However, when I run ./spark-shell I see the message
>
>     4/08/03 16:02:15 INFO SecurityManager: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
>
> and then spark-shell prints all INFO messages as is. What did I miss? Why does spark-shell use the default log4j properties and not the one defined in the /conf directory? Is there another solution to prevent spark-shell from printing INFO messages?
>
> Thanks,
> Gil.
Re: disable log4j for spark-shell
If you want to customize the logging behavior, the simplest way is to copy conf/log4j.properties.template to conf/log4j.properties. Then you can go and modify the log level in there. The spark shells should pick this up.

On Sun, Aug 3, 2014 at 6:16 AM, Sean Owen so...@cloudera.com wrote:
> That's just a template. Nothing consults that file by default; it's looking inside the Spark .jar. If you edit core/src/main/resources/org/apache/spark/log4j-defaults.properties and rebuild Spark, it will pick up those changes.
>
> I think you could also use the JVM argument -Dlog4j.configuration=conf/log4j-defaults.properties to force it to look at your local, edited props file. Someone may have to correct me, but I think that in master right now, that means using --driver-java-options=... to set an argument to the JVM that runs the shell?
>
> On Sun, Aug 3, 2014 at 2:07 PM, Gil Vernik g...@il.ibm.com wrote:
>> Hi,
>> I would like to run spark-shell without any INFO messages printed. [...]

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
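The copy-the-template recipe above can be sketched in shell. SPARK_HOME here points at a simulated layout with stub file contents, purely for illustration; with a real install you would use your actual $SPARK_HOME and skip the setup lines.

```shell
# Simulated Spark layout with a one-line stub template (illustration only)
SPARK_HOME=$(mktemp -d)
mkdir -p "$SPARK_HOME/conf"
printf 'log4j.rootCategory=INFO, console\n' > "$SPARK_HOME/conf/log4j.properties.template"

# Copy the template into place, then lower the log level in the copy;
# the spark shells read conf/log4j.properties if it exists.
cp "$SPARK_HOME/conf/log4j.properties.template" "$SPARK_HOME/conf/log4j.properties"
sed -i 's/INFO, console/WARN, console/' "$SPARK_HOME/conf/log4j.properties"

cat "$SPARK_HOME/conf/log4j.properties"
# -> log4j.rootCategory=WARN, console
```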