[repost to mailing list, ok I really gotta start hitting that
reply-to-all button]

Hi,
Spark uses log4j, which unfortunately does not support fine-grained
configuration from the command line, so some configuration-file editing
is required (unless you want to configure loggers programmatically, which
would mean editing spark-sql itself). There is, however, a trick: log4j
can substitute Java system properties into its configuration file. See
this Stack Overflow answer for details:
http://stackoverflow.com/a/31208461/917519. After editing the properties
file, you can then start spark-sql with:

bin/spark-sql --conf "spark.driver.extraJavaOptions=-Dmy.logger.threshold=OFF"

This is untested, but I hope it helps,
--Jakob

On 15 October 2015 at 22:56, Muhammad Ahsan <muhammad.ah...@gmail.com>
wrote:

> Hello Everyone!
>
> I want to know how to turn off logging when starting the *spark-sql shell*
> without changing the log4j configuration files. In the normal spark-shell I
> can use the following commands
>
> import org.apache.log4j.Logger
> import org.apache.log4j.Level
> Logger.getLogger("org").setLevel(Level.OFF)
> Logger.getLogger("akka").setLevel(Level.OFF)
>
>
> Thanks
>
> --
> Thanks
>
> Muhammad Ahsan
>
>