[ https://issues.apache.org/jira/browse/SPARK-17277?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Weizhong updated SPARK-17277:
-----------------------------
    Description: 
Now we can't use "SET k=v" to set a Hive conf. For example, run the SQL below in spark-sql:
{noformat}
set hive.exec.max.dynamic.partitions = 2000
{noformat}
but the value actually stays 1000 (the default value). This is because, after merging SPARK-15012, we no longer call runSqlHive("SET k=v") when setting a Hive conf.

  was:Now we can use "SET k=v" to set Hive conf, for example:


> Set hive conf failed
> --------------------
>
>                 Key: SPARK-17277
>                 URL: https://issues.apache.org/jira/browse/SPARK-17277
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Weizhong
>            Priority: Minor

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
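For reference, a sketch of how the regression shows up in a spark-sql session (the exact output formatting is approximate; the key point is that the value read back is still the default, not 2000):

{noformat}
spark-sql> SET hive.exec.max.dynamic.partitions=2000;
spark-sql> SET hive.exec.max.dynamic.partitions;
hive.exec.max.dynamic.partitions	1000
{noformat}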