[ https://issues.apache.org/jira/browse/SPARK-14881?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Davies Liu resolved SPARK-14881.
--------------------------------
       Resolution: Fixed
    Fix Version/s: 2.0.0

Issue resolved by pull request 12648
[https://github.com/apache/spark/pull/12648]

> pyspark and sparkR shell default log level should match spark-shell/Scala
> -------------------------------------------------------------------------
>
>                 Key: SPARK-14881
>                 URL: https://issues.apache.org/jira/browse/SPARK-14881
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, Spark Shell, SparkR
>    Affects Versions: 2.0.0
>            Reporter: Felix Cheung
>            Priority: Minor
>             Fix For: 2.0.0
>
> Scala spark-shell defaults to log level WARN. pyspark and sparkR should match
> that by default (the user can change it later):
>
> # ./bin/spark-shell
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel).
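The behavior being requested is: the shell starts with a quieter default level (WARN), and the user can opt into more verbose output afterwards via `sc.setLogLevel(newLevel)`. A real PySpark session is needed to demonstrate that call itself, so the following is only a rough sketch of the same pattern using Python's standard `logging` module, not Spark:

```python
import logging

# Analogy using stdlib logging (not Spark itself): the shell picks a
# quieter default (WARN), and the user may change it later, as
# sc.setLogLevel(newLevel) does in a PySpark shell.
shell_logger = logging.getLogger("shell")
shell_logger.setLevel(logging.WARNING)  # default: WARN

print(shell_logger.isEnabledFor(logging.INFO))  # False under WARN

# User opts into more verbose output later:
shell_logger.setLevel(logging.INFO)
print(shell_logger.isEnabledFor(logging.INFO))  # True
```

In an actual PySpark shell the equivalent user-facing call would be `sc.setLogLevel("INFO")`.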