[ https://issues.apache.org/jira/browse/SPARK-13666?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Marcelo Vanzin resolved SPARK-13666.
------------------------------------
    Resolution: Duplicate

SPARK-15012 removed the warning altogether.

> Annoying warnings from SQLConf in log output
> --------------------------------------------
>
>                 Key: SPARK-13666
>                 URL: https://issues.apache.org/jira/browse/SPARK-13666
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Marcelo Vanzin
>            Priority: Minor
>
> Whenever I run spark-shell I get a bunch of warnings about SQL configuration:
> {noformat}
> 16/03/03 19:00:25 WARN hive.HiveSessionState$$anon$1: Attempt to set
> non-Spark SQL config in SQLConf: key = spark.yarn.driver.memoryOverhead,
> value = 26
> 16/03/03 19:00:25 WARN hive.HiveSessionState$$anon$1: Attempt to set
> non-Spark SQL config in SQLConf: key = spark.yarn.executor.memoryOverhead,
> value = 26
> 16/03/03 19:00:25 WARN hive.HiveSessionState$$anon$1: Attempt to set
> non-Spark SQL config in SQLConf: key = spark.executor.cores, value = 1
> 16/03/03 19:00:25 WARN hive.HiveSessionState$$anon$1: Attempt to set
> non-Spark SQL config in SQLConf: key = spark.executor.memory, value =
> 268435456
> {noformat}
> That shouldn't happen, since I'm not setting those values explicitly. They're
> either set internally by Spark or come from spark-defaults.conf.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
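For context, the warnings above arise because every entry in the session's configuration is copied into SQLConf, and anything whose key is not a SQL property gets flagged. A minimal standalone sketch of that kind of prefix check (hypothetical names, not the actual HiveSessionState code):

```scala
// Sketch: flag config entries that are not Spark SQL properties.
// `isSqlConf` and the sample keys are illustrative, not Spark internals.
object SqlConfWarningSketch {
  def isSqlConf(key: String): Boolean = key.startsWith("spark.sql.")

  def main(args: Array[String]): Unit = {
    val conf = Map(
      "spark.sql.shuffle.partitions"    -> "200", // a real SQL property: no warning
      "spark.yarn.driver.memoryOverhead" -> "26"  // cluster setting: would be flagged
    )
    conf.foreach { case (k, v) =>
      if (!isSqlConf(k)) {
        // Mirrors the shape of the logged message in the report above.
        println(s"Attempt to set non-Spark SQL config in SQLConf: key = $k, value = $v")
      }
    }
  }
}
```

Under this model the reporter's point is clear: values propagated from spark-defaults.conf or set internally by Spark trip the check even though the user never touched SQLConf, which is why SPARK-15012 dropped the warning rather than trying to filter callers.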