Hyukjin Kwon created SPARK-37729:
------------------------------------

             Summary: SparkSession.setLogLevel not working in Spark Shell
                 Key: SPARK-37729
                 URL: https://issues.apache.org/jira/browse/SPARK-37729
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 3.3.0
            Reporter: Hyukjin Kwon
In Spark 3.2:

{code}
scala> import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.SparkSession

scala> spark.sparkContext.setLogLevel("FATAL")

scala> SparkSession.builder.config("spark.abc", "abc").getOrCreate
res1: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@7dafb9f9

scala> spark.sparkContext.setLogLevel("WARN")

scala> SparkSession.builder.config("spark.abc", "abc").getOrCreate
21/12/23 21:08:18 WARN SparkSession$Builder: Using an existing SparkSession; some spark core configurations may not take effect.
res3: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@7dafb9f9

scala> spark.sparkContext.setLogLevel("FATAL")

scala> SparkSession.builder.config("spark.abc", "abc").getOrCreate
res5: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@7dafb9f9
{code}

In the current master:

{code}
scala> import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.SparkSession

scala> spark.sparkContext.setLogLevel("FATAL")

scala> SparkSession.builder.config("spark.abc", "abc").getOrCreate
res1: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@3e8a1137

scala> spark.sparkContext.setLogLevel("WARN")

scala> SparkSession.builder.config("spark.abc", "abc").getOrCreate
res3: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@3e8a1137

scala> spark.sparkContext.setLogLevel("FATAL")

scala> SparkSession.builder.config("spark.abc", "abc").getOrCreate
res5: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@3e8a1137
{code}

It seems the log level set via {{setLogLevel}} initially takes effect, but the level cannot be changed afterward: in the master transcript above, switching to "WARN" no longer re-enables the {{SparkSession$Builder}} warning that Spark 3.2 prints at the same step.

--
This message was sent by Atlassian Jira
(v8.20.1#820001)
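As a possible interim workaround (this assumes the regression is related to master having moved to Log4j 2, which is not confirmed here; {{Configurator}} and the imports below are Log4j 2 API names, not Spark APIs), the root logger level can be changed in the shell by talking to Log4j 2 directly instead of going through {{setLogLevel}}:

{code}
// Sketch only: bypasses SparkContext.setLogLevel and sets the
// Log4j 2 root logger level directly from the spark-shell.
scala> import org.apache.logging.log4j.Level
import org.apache.logging.log4j.Level

scala> import org.apache.logging.log4j.core.config.Configurator
import org.apache.logging.log4j.core.config.Configurator

scala> Configurator.setRootLevel(Level.WARN)
{code}

If this does change the effective level while {{setLogLevel}} does not, that would point at the bridge between {{SparkContext.setLogLevel}} and the underlying logging backend rather than at Log4j itself.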