That's all right, I managed to reduce the log level by removing the
logback dependency in the pom.xml.
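For readers hitting the same SLF4J multiple-bindings problem, the usual Maven fix is an exclusion rather than deleting the dependency outright. A hedged sketch (the spark-core coordinates and version below are illustrative; check which dependency actually drags in logback with `mvn dependency:tree`):

```xml
<!-- Illustrative only: exclude the logback SLF4J binding pulled in
     transitively, so Spark's default log4j binding takes over. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <version>2.4.3</version>
  <exclusions>
    <exclusion>
      <groupId>ch.qos.logback</groupId>
      <artifactId>logback-classic</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```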
On Sat, May 11, 2019 at 02:54:49PM +0200, Nicolas Paris wrote:
> Hi
>
> I have a Spark code source with tests that create SparkSessions.
> I am running the spark-testing framework.
> My concern is that I am not able to configure the log level to INFO.
> I have large debug traces such as:
>
> DEBUG org.spark_project.jetty.util.Jetty -
> java.lang.NumberFormatException: For input
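For test suites, a log4j.properties on the test classpath (e.g. src/test/resources/) is usually enough to silence this. A minimal sketch, based on the stock log4j.properties.template shipped with Spark; the jetty logger name matches the shaded package seen in the trace above:

```properties
# Root logger at INFO, writing to the console appender.
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Quiet the shaded Jetty classes that produce the DEBUG noise above.
log4j.logger.org.spark_project.jetty=WARN
```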
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Anyone know how to set the log level in spark-submit? Thanks
Put a log4j.properties file in conf/. You can copy
log4j.properties.template as a good base.
On Wednesday, July 29, 2015, canan chen ccn...@gmail.com wrote:
> Anyone know how to set the log level in spark-submit? Thanks
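The advice above can be sketched as follows. This is a hedged outline, not a definitive recipe: the copy step uses the template that ships with Spark, and the `-Dlog4j.configuration` override (log4j 1.x syntax) is for pointing a single job at a custom file; the path shown is a placeholder.

```shell
# Cluster-wide default: copy the shipped template and edit its
# log4j.rootCategory line.
cp conf/log4j.properties.template conf/log4j.properties

# Per-job override for the driver, without touching conf/:
spark-submit \
  --driver-java-options "-Dlog4j.configuration=file:/path/to/log4j.properties" \
  ...
```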
Hi,
I want to check the DEBUG log of the Spark executor on YARN (using
yarn-cluster mode), but neither of the following works:
1. yarn daemonlog setlevel DEBUG YarnChild.class
2. setting log4j.properties in the spark/conf folder on the client node.
So how can I set the log level of the Spark executor on a YARN container
to DEBUG?
Thanks!
--
Wang Haihua
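Editing spark/conf on the client does not reach YARN containers, because executors run on other nodes with their own classpaths. The documented workaround is to ship a log4j file with the job. A sketch under stated assumptions: `my-log4j.properties` is a hypothetical file name; `--files` uploads it into each container's working directory, so a bare filename resolves there:

```shell
spark-submit --master yarn --deploy-mode cluster \
  --files my-log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=my-log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=my-log4j.properties" \
  app.jar
```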
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h
Hi!
According to
https://spark.apache.org/docs/0.9.0/configuration.html#configuring-logging,
changing the log level is just a matter of creating a log4j.properties (which
is in the classpath of Spark) and changing the log level there for the root
logger. I did these steps on every node in the cluster (master and worker
nodes). However, after restart there is still no debug output as desired, but
only ...
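Why changing the root logger should be enough: log4j loggers form a dot-separated hierarchy, and any logger without an explicit level inherits its effective level from the nearest configured ancestor, ultimately the root. Python's stdlib logging has the same inheritance model, so here is a rough analogy (Python is used only as an illustration; the logger names mirror the log4j ones from this thread):

```python
import logging

# Root logger at DEBUG: analogous to log4j.rootCategory=DEBUG.
logging.getLogger().setLevel(logging.DEBUG)

# Explicit level on one subtree: analogous to
# log4j.logger.org.spark_project.jetty=WARN.
jetty = logging.getLogger("org.spark_project.jetty")
jetty.setLevel(logging.WARNING)

# The configured subtree keeps its own level...
print(jetty.getEffectiveLevel() == logging.WARNING)  # True
# ...while an unconfigured logger inherits DEBUG from the root.
spark = logging.getLogger("org.apache.spark")
print(spark.getEffectiveLevel() == logging.DEBUG)  # True
```

So if the root level genuinely took effect, every unconfigured logger would emit DEBUG; no output at all suggests the file was never picked up from the classpath, or another log4j.properties shadows it.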