Here’s an example. I echoed JAVA_OPTS so that you can see what I’ve got, and then
ran ‘activator run’ in the project directory.


jjones-mac:analyzer-perf jjones$ echo $JAVA_OPTS
-Xmx4g -Xmx4g -Dlog4j.configuration=file:/Users/jjones/src/adaptive/adaptiveobjects/analyzer-perf/conf/log4j.properties

jjones-mac:analyzer-perf jjones$ activator run
[info] Loading project definition from /Users/jjones/src/adaptive/adaptiveobjects/analyzer-perf/project
[info] Set current project to analyzer-perf (in build file:/Users/jjones/src/adaptive/adaptiveobjects/analyzer-perf/)
[info] Running com.adaptive.analyzer.perf.AnalyzerPerf
11:15:24.066 [run-main-0] INFO  org.apache.spark.SparkContext - Running Spark version 1.4.1
11:15:24.150 [run-main-0] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, value=[Rate of successful kerberos logins and latency (milliseconds)], valueName=Time)
11:15:24.156 [run-main-0] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, sampleName=Ops, type=DEFAULT, value=[Rate of failed kerberos logins and latency (milliseconds)], valueName=Time)

As I mentioned below (repeated here for completeness), I also have this in my code.


import org.apache.log4j.{Level, Logger, PropertyConfigurator}

PropertyConfigurator.configure("conf/log4j.properties")
Logger.getRootLogger().setLevel(Level.OFF)
Logger.getLogger("org").setLevel(Level.OFF)
Logger.getLogger("akka").setLevel(Level.OFF)

And here’s my log4j.properties (note, I’ve also tried setting the level to OFF):


# Set everything to be logged to the console
log4j.rootCategory=WARN
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Change this to set Spark log level
log4j.logger.org.apache.spark=WARN

# Silence akka remoting
log4j.logger.Remoting=WARN

# Ignore messages below warning level from Jetty, because it's a bit verbose
log4j.logger.org.eclipse.jetty=WARN

spark.log.threshold=OFF
spark.root.logger=OFF,DRFA
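One thing worth noting about the file above: in log4j 1.x a properties-defined appender only takes effect if it is named on some logger, so `log4j.rootCategory=WARN` with no appender list leaves the console appender defined above unattached. A minimal sketch of a variant that wires the appender in and silences everything (same console appender, level set to OFF):

```
# Attach the console appender to the root category and turn logging off entirely
log4j.rootCategory=OFF, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```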


From: Alex Kozlov
Date: Tuesday, October 6, 2015 at 10:50 AM
To: Jeff Jones
Cc: "user@spark.apache.org"
Subject: Re: How can I disable logging when running local[*]?

Try

JAVA_OPTS='-Dlog4j.configuration=file:/<path-to-log4j.properties>'

Internally, this is just spark.driver.extraJavaOptions, which you should be 
able to set in conf/spark-defaults.conf
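For example, a line like this in conf/spark-defaults.conf (the path here is illustrative; substitute your own):

```
spark.driver.extraJavaOptions  -Dlog4j.configuration=file:/path/to/conf/log4j.properties
```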

Can you provide more details how you invoke the driver?

On Tue, Oct 6, 2015 at 9:48 AM, Jeff Jones <jjo...@adaptivebiotech.com> wrote:
Thanks. Any chance you know how to pass this to a Scala app that is run via
Typesafe Activator?

I tried putting it in $JAVA_OPTS but I get:

Unrecognized option: --driver-java-options
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.


I tried a bunch of different quoting but nothing produced a good result. I also
tried passing it directly to activator using -jvm but it still produces the
same results with verbose logging. Is there a way I can tell if it’s picking up
my file?
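One way to check (log4j 1.x only) is to turn on log4j's own diagnostics: at startup it prints to stderr which configuration URL it loaded, in a line beginning "log4j: Using URL". A sketch, assuming JAVA_OPTS is actually passed through to the forked JVM:

```shell
# Enable log4j 1.x internal diagnostics; it reports which config file it reads
export JAVA_OPTS="$JAVA_OPTS -Dlog4j.debug=true"
activator run
```

If no "log4j: Using URL [...]" line mentions your file, the -Dlog4j.configuration setting is not reaching the JVM that runs your code.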



From: Alex Kozlov
Date: Monday, October 5, 2015 at 8:34 PM
To: Jeff Jones
Cc: "user@spark.apache.org"
Subject: Re: How can I disable logging when running local[*]?

Did you try “--driver-java-options 
'-Dlog4j.configuration=file:/<path-to-log4j.properties>'” and setting the 
log4j.rootLogger=FATAL,console?

On Mon, Oct 5, 2015 at 8:19 PM, Jeff Jones <jjo...@adaptivebiotech.com> wrote:
I’ve written an application that hosts the Spark driver in-process using
“local[*]”. I’ve turned off logging in my conf/log4j.properties file. I’ve also
tried putting the following code prior to creating my SparkContext. These were
cobbled together from various posts I’ve read. None of these steps have worked. I’m
still getting a ton of logging to the console. Anything else I can try?

Thanks,
Jeff

private def disableLogging(): Unit = {
  import org.apache.log4j.{Level, Logger, PropertyConfigurator}

  PropertyConfigurator.configure("conf/log4j.properties")
  Logger.getRootLogger().setLevel(Level.OFF)
  Logger.getLogger("org").setLevel(Level.OFF)
  Logger.getLogger("akka").setLevel(Level.OFF)
}


This message (and any attachments) is intended only for the designated recipient(s). It may contain confidential or proprietary information, or have other limitations on use as indicated by the sender. If you are not a designated recipient, you may not review, use, copy or distribute this message. If you received this in error, please notify the sender by reply e-mail and delete this message.



--
Alex Kozlov
(408) 507-4987
(408) 830-9982 fax
(650) 887-2135 efax
ale...@gmail.com

