[ https://issues.apache.org/jira/browse/SPARK-27943?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16856595#comment-16856595 ]

Apache Spark commented on SPARK-27943:
--------------------------------------

User 'beliefer' has created a pull request for this issue:
https://github.com/apache/spark/pull/24372

> Implement default constraint with Column for Hive table
> -------------------------------------------------------
>
>                 Key: SPARK-27943
>                 URL: https://issues.apache.org/jira/browse/SPARK-27943
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>    Affects Versions: 2.3.0, 2.4.0
>            Reporter: jiaan.geng
>            Priority: Major
>
> Column default constraints are part of the ANSI SQL standard.
> Hive 3.0+ supports default constraints
> (ref: https://issues.apache.org/jira/browse/HIVE-18726),
> but Spark SQL does not implement this feature yet.
> Hive is widely used in production environments and is the de facto 
> standard in the big data field. However, many different Hive versions are 
> used in production, and their feature sets differ.
> Spark SQL needs to implement default constraints, but there are two points 
> to pay attention to in the design:
> One is that Spark SQL should reduce coupling with Hive.
> The other is that default constraints should be compatible with different 
> versions of Hive.
> We want to save the default constraint metadata into the Hive table's 
> properties, and then restore it from those properties after the client 
> fetches the latest metadata.
> This is the same approach used for other metadata (e.g. partitions, 
> buckets, statistics).
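> A minimal sketch of this round trip, assuming a hypothetical property key 
> format and a hypothetical "default" metadata key (neither is an existing 
> Spark API):
> {code:scala}
> import org.apache.spark.sql.types.{MetadataBuilder, StructField, StructType}
>
> object DefaultConstraintProps {
>   // Hypothetical key: one table property per column that declares a default,
>   // e.g. "spark.sql.column.default.ts" -> "CURRENT_TIMESTAMP()".
>   def propKey(col: String): String = s"spark.sql.column.default.$col"
>
>   // Save: collect each column's default expression into table properties.
>   def defaultsToProperties(schema: StructType): Map[String, String] =
>     schema.fields.collect {
>       case f if f.metadata.contains("default") =>
>         propKey(f.name) -> f.metadata.getString("default")
>     }.toMap
>
>   // Restore: re-attach each stored default to the matching StructField
>   // after the client fetches the table metadata back from Hive.
>   def restoreDefaults(schema: StructType, props: Map[String, String]): StructType =
>     StructType(schema.fields.map { f =>
>       props.get(propKey(f.name)).fold(f) { expr =>
>         f.copy(metadata = new MetadataBuilder()
>           .withMetadata(f.metadata)
>           .putString("default", expr)
>           .build())
>       }
>     })
> }
> {code}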
> Because a default constraint is part of a column, I think we can reuse the 
> metadata of StructField. The default constraint will be cached in the 
> metadata of StructField.
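> For illustration, a column default cached in StructField's existing 
> metadata (the "default" key name is an assumption, as above):
> {code:scala}
> import org.apache.spark.sql.types.{IntegerType, MetadataBuilder, StructField}
>
> // Roughly the column that Hive 3's "CREATE TABLE t (a INT DEFAULT 0)"
> // (HIVE-18726) would describe, with the default cached in metadata.
> val meta = new MetadataBuilder().putString("default", "0").build()
> val field = StructField("a", IntegerType, nullable = true, metadata = meta)
>
> // Any later consumer can read the cached default back out.
> val default: Option[String] =
>   if (field.metadata.contains("default")) Some(field.metadata.getString("default"))
>   else None // the column declares no default
> {code}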
> This is a big piece of work, so I want to split it into several sub-tasks, 
> as follows:
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
