[ https://issues.apache.org/jira/browse/SPARK-17277?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15656567#comment-15656567 ]
Wenchen Fan commented on SPARK-17277:
-------------------------------------

A workaround is to set this config on the Hive side, or to put it in the Spark conf before launching your Spark application (a sketch follows the quoted issue below). Can you try it out?

> Set hive conf failed
> --------------------
>
>                 Key: SPARK-17277
>                 URL: https://issues.apache.org/jira/browse/SPARK-17277
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Weizhong
>            Priority: Minor
>
> Currently we can't use "SET k=v" to set a Hive conf. For example, run the following SQL in spark-sql:
> {noformat}
> set hive.exec.max.dynamic.partitions = 2000
> {noformat}
> but the value actually remains 1000 (the default). This is because after SPARK-15012 was merged, we no longer call runSqlHive("SET k=v") when setting a Hive conf.
> Only confs that Spark uses directly are OK, like hive.exec.dynamic.partition.mode etc.
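For reference, a minimal sketch of the suggested workaround; the exact mechanism to use may depend on your Spark version, so treat the flags below as options to verify rather than a definitive fix:

{noformat}
# Option 1: pass the conf to the spark-sql CLI at launch time via --hiveconf
spark-sql --hiveconf hive.exec.max.dynamic.partitions=2000

# Option 2: put it in the Spark conf before launching the application;
# spark.hadoop.* properties are copied into the Hadoop/Hive configuration
spark-submit --conf spark.hadoop.hive.exec.max.dynamic.partitions=2000 ...

# Option 3: set it on the Hive side, e.g. in hive-site.xml:
#   <property>
#     <name>hive.exec.max.dynamic.partitions</name>
#     <value>2000</value>
#   </property>
{noformat}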