Re: How can I disable logging when running local[*]?

2015-10-07 Thread Dean Wampler
Try putting the following in the SBT build file, e.g., ./build.sbt:

// Fork when running, so you can ...

fork := true

// ... add a JVM option to use when forking a JVM for 'run'
javaOptions += "-Dlog4j.configuration=file:/"
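
For reference, a slightly fuller sketch of the same build.sbt settings, with a
hypothetical path standing in for the truncated one above:

// Fork when running, so the javaOptions below reach the JVM that 'run' starts.
// (In sbt, javaOptions only apply to 'run' when forking is enabled.)
fork := true

// Hypothetical path; point this at your own log4j.properties.
javaOptions += "-Dlog4j.configuration=file:/path/to/conf/log4j.properties"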

dean


Dean Wampler, Ph.D.
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
Typesafe <http://typesafe.com>
@deanwampler <http://twitter.com/deanwampler>
http://polyglotprogramming.com


Re: How can I disable logging when running local[*]?

2015-10-07 Thread Alex Kozlov
Hmm, clearly the parameter is not passed to the program. This is probably an
activator issue. I wonder how you specify the other parameters, like driver
memory, number of cores, etc. Just out of curiosity, can you run this
program:

import org.apache.spark.SparkConf

// Note: get(key) alone throws NoSuchElementException when the key is unset, so pass a default.
val out = new SparkConf(true).get("spark.driver.extraJavaOptions", "<not set>")

in your environment and see what the output is?

Also, make sure spark-defaults.conf is on your classpath.


Re: How can I disable logging when running local[*]?

2015-10-06 Thread Jeff Jones
Here’s an example. I echoed JAVA_OPTS so that you can see what I’ve got. Then I 
call ‘activator run’ in the project directory.


jjones-mac:analyzer-perf jjones$ echo $JAVA_OPTS

-Xmx4g -Xmx4g 
-Dlog4j.configuration=file:/Users/jjones/src/adaptive/adaptiveobjects/analyzer-perf/conf/log4j.properties

jjones-mac:analyzer-perf jjones$ activator run

[info] Loading project definition from 
/Users/jjones/src/adaptive/adaptiveobjects/analyzer-perf/project

[info] Set current project to analyzer-perf (in build 
file:/Users/jjones/src/adaptive/adaptiveobjects/analyzer-perf/)

[info] Running com.adaptive.analyzer.perf.AnalyzerPerf

11:15:24.066 [run-main-0] INFO  org.apache.spark.SparkContext - Running Spark 
version 1.4.1

11:15:24.150 [run-main-0] DEBUG o.a.h.m.lib.MutableMetricsFactory - field 
org.apache.hadoop.metrics2.lib.MutableRate 
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with 
annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, 
sampleName=Ops, type=DEFAULT, value=[Rate of successful kerberos logins and 
latency (milliseconds)], valueName=Time)

11:15:24.156 [run-main-0] DEBUG o.a.h.m.lib.MutableMetricsFactory - field 
org.apache.hadoop.metrics2.lib.MutableRate 
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with 
annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, always=false, 
sampleName=Ops, type=DEFAULT, value=[Rate of failed kerberos logins and latency 
(milliseconds)], valueName=Time)

As I mentioned below but repeated for completeness, I also have this in my code.


import org.apache.log4j.{Level, Logger, PropertyConfigurator}

PropertyConfigurator.configure("conf/log4j.properties")
Logger.getRootLogger().setLevel(Level.OFF)
Logger.getLogger("org").setLevel(Level.OFF)
Logger.getLogger("akka").setLevel(Level.OFF)

And here’s my log4j.properties (note, I’ve also tried setting the level to OFF):


# Set everything to be logged to the console
log4j.rootCategory=WARN
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Change this to set Spark log level
log4j.logger.org.apache.spark=WARN

# Silence akka remoting
log4j.logger.Remoting=WARN

# Ignore messages below warning level from Jetty, because it's a bit verbose
log4j.logger.org.eclipse.jetty=WARN

spark.log.threshold=OFF
spark.root.logger=OFF,DRFA
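
One detail worth flagging in the file above: log4j.rootCategory sets only a level
here and attaches no appender, whereas Spark's stock conf/log4j.properties.template
attaches the console appender on that same line:

# As in Spark's log4j.properties.template, with the level used above:
log4j.rootCategory=WARN, console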


Re: How can I disable logging when running local[*]?

2015-10-06 Thread Alexander Pivovarov
The easiest way to control logging in the Spark shell is to run Logger.setLevel
calls at the beginning of your program, e.g.:

org.apache.log4j.Logger.getLogger("com.amazon").setLevel(org.apache.log4j.Level.WARN)
org.apache.log4j.Logger.getLogger("com.amazonaws").setLevel(org.apache.log4j.Level.WARN)
org.apache.log4j.Logger.getLogger("amazon.emr").setLevel(org.apache.log4j.Level.WARN)
org.apache.log4j.Logger.getLogger("akka").setLevel(org.apache.log4j.Level.WARN)
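
The same idea, condensed and with imports (a sketch; the com.amazon*, amazon.emr
logger names above come from an EMR setup and are just examples):

import org.apache.log4j.{Level, Logger}

// Quiet several noisy logger hierarchies at once.
Seq("com.amazon", "com.amazonaws", "amazon.emr", "akka")
  .foreach(name => Logger.getLogger(name).setLevel(Level.WARN))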


Re: How can I disable logging when running local[*]?

2015-10-06 Thread Alex Kozlov
Try

JAVA_OPTS='-Dlog4j.configuration=file:/'

Internally, this is just spark.driver.extraJavaOptions, which you should be
able to set in conf/spark-defaults.conf
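
For reference, the equivalent conf/spark-defaults.conf entry would look like this
(the log4j.properties path is hypothetical, standing in for the truncated one above):

spark.driver.extraJavaOptions  -Dlog4j.configuration=file:/path/to/log4j.properties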

Can you provide more details on how you invoke the driver?


Re: How can I disable logging when running local[*]?

2015-10-06 Thread Jeff Jones
Thanks. Any chance you know how to pass this to a Scala app that is run via
Typesafe Activator?

I tried putting it in $JAVA_OPTS but I get:

Unrecognized option: --driver-java-options

Error: Could not create the Java Virtual Machine.

Error: A fatal exception has occurred. Program will exit.


I tried a bunch of different quoting, but nothing produced a good result. I also
tried passing it directly to activator using -jvm, but it still produces the
same results with verbose logging. Is there a way I can tell if it's picking up
my file?


Re: How can I disable logging when running local[*]?

2015-10-05 Thread Alex Kozlov
Did you try “--driver-java-options
'-Dlog4j.configuration=file:/'” and setting the
log4j.rootLogger=FATAL,console?
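
For reference, a minimal log4j.properties along those lines (a sketch reusing the
stock log4j console appender):

# Route only FATAL messages to the console.
log4j.rootLogger=FATAL, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n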


How can I disable logging when running local[*]?

2015-10-05 Thread Jeff Jones
I’ve written an application that hosts the Spark driver in-process using
“local[*]”. I’ve turned off logging in my conf/log4j.properties file. I’ve also
tried putting the following code prior to creating my SparkContext. These were
cobbled together from various posts I’ve found. None of these steps have worked;
I’m still getting a ton of logging to the console. Anything else I can try?

Thanks,
Jeff

private def disableLogging(): Unit = {
  import org.apache.log4j.{Level, Logger, PropertyConfigurator}

  // Load the file-based config, then silence the root and the noisiest loggers.
  PropertyConfigurator.configure("conf/log4j.properties")
  Logger.getRootLogger().setLevel(Level.OFF)
  Logger.getLogger("org").setLevel(Level.OFF)
  Logger.getLogger("akka").setLevel(Level.OFF)
}
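
A minimal sketch of how this is meant to be wired up (object and app names are
hypothetical), with the logging call placed before the SparkContext is created:

import org.apache.log4j.{Level, Logger, PropertyConfigurator}
import org.apache.spark.{SparkConf, SparkContext}

object QuietLocalApp {
  def main(args: Array[String]): Unit = {
    disableLogging() // quiet log4j before the SparkContext starts logging
    val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("quiet"))
    // ... job code ...
    sc.stop()
  }

  private def disableLogging(): Unit = {
    PropertyConfigurator.configure("conf/log4j.properties")
    Logger.getRootLogger().setLevel(Level.OFF)
    Logger.getLogger("org").setLevel(Level.OFF)
    Logger.getLogger("akka").setLevel(Level.OFF)
  }
}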

