Another option is to make a copy of log4j.properties in the current
directory from which you start spark-shell, and change
"log4j.rootCategory=INFO, console" to
"log4j.rootCategory=ERROR, console". Then start the shell.

On Wed, Jan 7, 2015 at 3:39 AM, Akhil <ak...@sigmoidanalytics.com> wrote:

> Edit your conf/log4j.properties file and change the following line:
>
>        log4j.rootCategory=INFO, console
>
> to
>
>         log4j.rootCategory=ERROR, console
>
> Another approach would be to:
>
> Fire up spark-shell and type in the following:
>
>     import org.apache.log4j.Logger
>     import org.apache.log4j.Level
>
>     Logger.getLogger("org").setLevel(Level.OFF)
>     Logger.getLogger("akka").setLevel(Level.OFF)
>
> You won't see any logs after that.
>
