[ 
https://issues.apache.org/jira/browse/HUDI-1374?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

liwei updated HUDI-1374:
------------------------
    Summary: spark sql for hudi support dynamic partitioning  (was: spark sql 
for hudi support Static partition mode)

> spark sql for hudi support dynamic partitioning
> -----------------------------------------------
>
>                 Key: HUDI-1374
>                 URL: https://issues.apache.org/jira/browse/HUDI-1374
>             Project: Apache Hudi
>          Issue Type: Sub-task
>          Components: Spark Integration
>            Reporter: liwei
>            Assignee: liwei
>            Priority: Major
>
> Discussed in https://github.com/apache/hudi/pull/2196
> A. Thanks so much. This PR needs to solve the issue with a better approach. 
> Now I am clearer about the difference in overwrite semantics between 
> table.overwrite and Spark SQL overwrite for Hudi.
> B. Also, Spark SQL overwrite for Hudi should have the same capabilities as 
> Spark SQL, Hive, and Delta Lake. These engines support three partition 
> overwrite modes (a SQL sketch follows the links below):
> 1. Dynamic partition: delete all partition data, then insert the new data 
> into the respective partitions.
> 2. Static partition: overwrite only the partitions that the user specifies.
> 3. Mixed partition: a combination of 1 and 2.
> More detail in:
> https://spark.apache.org/docs/3.0.0-preview/sql-ref-syntax-dml-insert-overwrite-table.html
> https://www.programmersought.com/article/47155360487/
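> For illustration, a minimal Spark SQL sketch of modes 1 and 2 (assuming a 
> hypothetical partitioned table h0 with partition column dt and a source 
> table src):
>
>   -- Static partition: the partition value is fixed in the PARTITION clause,
>   -- so only dt='2020-11-01' is replaced.
>   INSERT OVERWRITE TABLE h0 PARTITION (dt = '2020-11-01')
>   SELECT id, name FROM src;
>
>   -- Dynamic partition: dt is left unspecified and is taken from the last
>   -- column of the SELECT; which existing partitions get dropped first is
>   -- controlled by spark.sql.sources.partitionOverwriteMode (static/dynamic).
>   INSERT OVERWRITE TABLE h0 PARTITION (dt)
>   SELECT id, name, dt FROM src;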
> C. Our plan:
> 1. Currently, Spark SQL overwrite for Hudi is dynamic partition. I will 
> resolve it in HUDI-1349: first support deleting all partitions in HUDI-1350, 
> then land HUDI-1349.
> 2. Currently, Spark SQL for Hudi does not support "static partition" mode; 
> we will then land it in



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
