[ https://issues.apache.org/jira/browse/SPARK-16092?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

marymwu updated SPARK-16092:
----------------------------
    Component/s: SQL

> Setting hive.exec.dynamic.partition.mode=nonstrict as a global variable in the
> Spark 2.0 configuration file has no effect, while it does in Spark 1.6
> ----------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16092
>                 URL: https://issues.apache.org/jira/browse/SPARK-16092
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: marymwu
>
> Setting hive.exec.dynamic.partition.mode=nonstrict as a global variable in the
> Spark 2.0 configuration file has no effect, whereas the same setting works in
> Spark 1.6.
> Precondition:
> hive.exec.dynamic.partition.mode=nonstrict is set as a global variable in the
> Spark 2.0 configuration file:
> "<property>
>     <name>hive.exec.dynamic.partition.mode</name>
>     <value>nonstrict</value>
> </property>"
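> As a quick check (not part of the original report), one can inspect the
> effective value of the property from inside a Spark 2.0 session to confirm
> whether the global setting was picked up at all; a minimal sketch in Scala,
> assuming spark is a Hive-enabled SparkSession:
>   // Prints the current value of the property as seen by this session.
>   // If the config-file setting took effect, the value column shows "nonstrict".
>   spark.sql("SET hive.exec.dynamic.partition.mode").show(truncate = false)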
> Testcase:
> "insert overwrite table d_test_tpc_2g_txt.marytest1 partition (dt)
>  select t.nid, t.price, t.dt
>  from (select nid, price, dt from d_test_tpc_2g_txt.marytest
>        where dt >= '2016-06-20' and dt <= '2016-06-21') t
>  group by t.nid, t.price, t.dt;"
> Result:
> Error: org.apache.spark.SparkException: Dynamic partition strict mode 
> requires at least one static partition column. To turn this off set 
> hive.exec.dynamic.partition.mode=nonstrict (state=,code=0)
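> The error text itself points at a session-level workaround: setting the
> property inside the session rather than in the configuration file. A minimal
> sketch (not from the original report, and untested against this bug), assuming
> a Hive-enabled SparkSession and the tables above; hive.exec.dynamic.partition
> is the standard companion Hive property for enabling dynamic partitions:
>   import org.apache.spark.sql.SparkSession
>
>   // Pass the properties at construction time; builder .config() entries
>   // become session-level settings and may bypass the config-file issue.
>   val spark = SparkSession.builder()
>     .appName("dynamic-partition-workaround")
>     .config("hive.exec.dynamic.partition", "true")
>     .config("hive.exec.dynamic.partition.mode", "nonstrict")
>     .enableHiveSupport()
>     .getOrCreate()
>
>   // Alternatively, issue the SET suggested by the error message in-session:
>   spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
>
>   // The insert from the test case should then pass the strict-mode check.
>   spark.sql("""insert overwrite table d_test_tpc_2g_txt.marytest1 partition (dt)
>                select t.nid, t.price, t.dt
>                from (select nid, price, dt from d_test_tpc_2g_txt.marytest
>                      where dt >= '2016-06-20' and dt <= '2016-06-21') t
>                group by t.nid, t.price, t.dt""")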
> Note:
> Spark 1.6 accepts the above SQL statement once
> hive.exec.dynamic.partition.mode=nonstrict is set as a global variable in the
> Spark configuration file, using the same <property> block shown above.


