Hi there,

we're having a strange problem using Spark in a Java application
via the JavaSparkContext:

We are using java.util.logging.* for logging in our application, with two
handlers (ConsoleHandler + FileHandler):

{{{
.handlers=java.util.logging.ConsoleHandler, java.util.logging.FileHandler

.level = FINE

java.util.logging.ConsoleHandler.level=INFO
java.util.logging.ConsoleHandler.formatter=java.util.logging.SimpleFormatter

java.util.logging.FileHandler.level= FINE
java.util.logging.FileHandler.formatter=java.util.logging.SimpleFormatter
java.util.logging.FileHandler.limit=10240000
java.util.logging.FileHandler.count=5
java.util.logging.FileHandler.append= true
java.util.logging.FileHandler.pattern=%t/delivery-model.%u.%g.txt

java.util.logging.SimpleFormatter.format=%1$tY-%1$tm-%1$td %1$tH:%1$tM:%1$tS %5$s%6$s%n
}}}
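For reference, a configuration like the one above can also be applied programmatically via `LogManager.readConfiguration`. The sketch below is a minimal, self-contained illustration (console handler only, no Spark involved) showing that after loading the config, the handlers end up on the root logger (`Logger.getLogger("")`) — which is exactly the logger whose handlers appear to be changed later:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.logging.ConsoleHandler;
import java.util.logging.Handler;
import java.util.logging.LogManager;
import java.util.logging.Logger;

public class JulConfigDemo {
    public static void main(String[] args) throws Exception {
        // A minimal JUL configuration, analogous to the properties file above
        // (console handler only, to keep the example self-contained).
        String config =
            ".handlers=java.util.logging.ConsoleHandler\n" +
            ".level=FINE\n" +
            "java.util.logging.ConsoleHandler.level=INFO\n";
        LogManager.getLogManager().readConfiguration(
            new ByteArrayInputStream(config.getBytes(StandardCharsets.UTF_8)));

        // The handlers declared under ".handlers" are attached to the root logger.
        Handler[] handlers = Logger.getLogger("").getHandlers();
        if (handlers.length != 1 || !(handlers[0] instanceof ConsoleHandler)) {
            throw new AssertionError("expected a single ConsoleHandler on the root logger");
        }
        Logger.getLogger(JulConfigDemo.class.getName()).info("JUL is configured");
    }
}
```

Inspecting `Logger.getLogger("").getHandlers()` before and after creating the Spark context is a quick way to confirm whether the handlers are actually being removed or only their levels changed.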

The thing is that as soon as the JavaSparkContext is started, our logging stops.

The log4j.properties for Spark looks like this:

{{{
log4j.rootLogger=WARN, theConsoleAppender
log4j.additivity.io.datapath=false
log4j.appender.theConsoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.theConsoleAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.theConsoleAppender.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %m%n
}}}

Obviously I'm not an expert in the logging architecture yet, but I
really need to understand how the handlers of our JUL logging get changed
by the Spark library.

Thanks in advance!


