[ https://issues.apache.org/jira/browse/SPARK-17302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15475156#comment-15475156 ]

Ryan Blue commented on SPARK-17302:
-----------------------------------

In 1.6.x, Spark pulled session config for Hive from a {{HiveConf}}; for example, 
[Spark respects|https://github.com/apache/spark/blob/v1.6.1/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/InsertIntoHiveTable.scala#L167] 
{{hive.exec.dynamic.partition.mode}}. That property could be set in hive-site.xml 
or in spark-defaults.conf, though the latter required the {{spark.hadoop.}} prefix. 
Now that {{SQLConf}} is used instead of {{HiveConf}}, both of those ways of setting 
Hive configuration variables are lost. Calling {{spark.conf.set("key", "value")}} 
works for users, but hive-site.xml defaults are no longer respected and an 
administrator can no longer set default values.
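
For reference, a minimal sketch of the per-session workaround that still works in 2.0.0 (the property values below are only illustrative):

{code:scala}
import org.apache.spark.sql.SparkSession

// Build a Hive-enabled session; in 2.0 the Hive session options live in
// SQLConf, not HiveConf, so they have to be set here rather than in
// hive-site.xml or via spark.hadoop.* in spark-defaults.conf.
val spark = SparkSession.builder()
  .appName("hive-session-conf-example")
  .enableHiveSupport()
  .getOrCreate()

// Per-session settings; hive-site.xml defaults are not consulted for these.
spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")
spark.conf.set("hive.exec.compress.output", "true")

// Equivalent SQL form:
spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
{code}

Neither form helps an administrator who needs to ship a cluster-wide default, which is the gap described above.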

> Cannot set non-Spark SQL session variables in hive-site.xml, 
> spark-defaults.conf, or using --conf
> -------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-17302
>                 URL: https://issues.apache.org/jira/browse/SPARK-17302
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Ryan Blue
>
> When configuration was changed for 2.0 to the new SparkSession structure, Spark 
> stopped using Hive's internal HiveConf for session state and now uses 
> HiveSessionState and an associated SQLConf. Now, session options like 
> hive.exec.compress.output and hive.exec.dynamic.partition.mode are pulled 
> from this SQLConf. This doesn't include session properties from hive-site.xml 
> (including hive.exec.compress.output), and no longer contains Spark-specific 
> overrides from spark-defaults.conf that used the spark.hadoop.hive... pattern.
> Also, setting these variables on the command-line no longer works because 
> settings must start with "spark.".
> Is there a recommended way to set Hive session properties?
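
As a diagnostic sketch of the discrepancy described in the report, assuming {{spark.hadoop.*}} entries from spark-defaults.conf (and, depending on deployment, hive-site.xml) are still copied into the SparkContext's Hadoop configuration, while the SQLConf-backed session config is what 2.0 actually reads:

{code:scala}
// Assumes a Hive-enabled SparkSession named `spark`, as in the sketch above.
val hadoopValue = spark.sparkContext.hadoopConfiguration
  .get("hive.exec.dynamic.partition.mode")         // e.g. "nonstrict" from spark.hadoop.* or hive-site.xml
val sessionValue = spark.conf
  .getOption("hive.exec.dynamic.partition.mode")   // typically None unless set via spark.conf.set or SET

println(s"hadoopConfiguration: $hadoopValue, session conf: $sessionValue")
{code}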


