[ https://issues.apache.org/jira/browse/SPARK-3007?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14095300#comment-14095300 ]
baishuo commented on SPARK-3007:
--------------------------------

After modifying the code, I can run HiveQL with dynamic partitions through SparkSqlCLIDriver:

spark-sql> insert overwrite table partition_test_spark partition(stat_date,province) select member_id2,name2,stat_date2,province2 from partition_test_input_spark2;
Time taken: 10.351 seconds
spark-sql> select * from partition_test_spark;
1    11    date1    pr1
2    22    date1    pr1
3    33    date1    pr2
4    44    date1    pr2
5    55    date2    pr1
6    66    date2    pr1
7    77    date2    pr2
8    88    date2    pr2
Time taken: 0.287 seconds
spark-sql> insert overwrite table partition_test_spark partition(stat_date='date1',province) select member_id2,name2,province2 from partition_test_input_spark2 where stat_date2='date2';
spark-sql> select * from partition_test_spark;
5    55    date1    pr1
6    66    date1    pr1
7    77    date1    pr2
8    88    date1    pr2
5    55    date2    pr1
6    66    date2    pr1
7    77    date2    pr2
8    88    date2    pr2

We can also verify that all the data lands in the expected partition directories. (A note on the Hive settings these inserts normally require follows the quoted issue below.)

----------------------------------------------------------------------

The scripts used to create partition_test_input_spark2 and partition_test_spark:

create table partition_test_input_spark2 (member_id2 string, name2 string, stat_date2 string, province2 string) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

LOAD DATA LOCAL INPATH '/root/Desktop/testpartition.txt' OVERWRITE INTO TABLE partition_test_input_spark2;

create table partition_test_spark (member_id string, name string) partitioned by (stat_date string, province string) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

> Add "Dynamic Partition" support to Spark Sql hive
> -------------------------------------------------
>
>                 Key: SPARK-3007
>                 URL: https://issues.apache.org/jira/browse/SPARK-3007
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: baishuo
>
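A note for anyone reproducing this: in Hive itself, dynamic-partition inserts like the ones above normally require dynamic partitioning to be switched on first. A minimal sketch, assuming Spark SQL honors the same standard Hive properties (the two "set" statements are ordinary Hive configuration, not something confirmed by this comment):

set hive.exec.dynamic.partition=true;
-- nonstrict mode is needed here because the first insert leaves every
-- partition column (stat_date, province) dynamic; strict mode demands
-- at least one static partition column
set hive.exec.dynamic.partition.mode=nonstrict;

insert overwrite table partition_test_spark partition(stat_date,province)
select member_id2,name2,stat_date2,province2 from partition_test_input_spark2;

If the inserts succeed, the "expected directories" mentioned above should follow Hive's usual partition layout, e.g. .../partition_test_spark/stat_date=date1/province=pr1/ and so on, one directory per (stat_date, province) pair.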