[ https://issues.apache.org/jira/browse/SPARK-51208?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-51208:
----------------------------------
        Parent: SPARK-44111
    Issue Type: Sub-task  (was: Bug)

> ColumnDefinition.toV1Column should preserve EXISTS_DEFAULT resolution
> ---------------------------------------------------------------------
>
>                 Key: SPARK-51208
>                 URL: https://issues.apache.org/jira/browse/SPARK-51208
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 4.0.0
>            Reporter: Szehon Ho
>            Priority: Major
>              Labels: pull-request-available
>
> Some external catalogs use toV1Column and fromV1Column. However, we noticed 
> that toV1Column un-resolves some already-resolved EXISTS_DEFAULT values.
> Long Description: When a user creates a column with a default value, Spark 
> resolves the user-provided CURRENT_DEFAULT SQL to produce an EXISTS_DEFAULT 
> value. This freezes the values in effect at creation time, for example by 
> resolving the current_database() and current_user() functions. Existing data 
> will get those frozen values.
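> 
> For example (a sketch; the resolved literal 'alice' is hypothetical, and the 
> metadata key names follow the ResolveDefaultColumns conventions), a default 
> of current_user() should end up stored under two metadata keys:
> {code:scala}
> import org.apache.spark.sql.types.MetadataBuilder
> 
> // Metadata a catalog expects after resolution at CREATE time
> val metadata = new MetadataBuilder()
>   .putString("CURRENT_DEFAULT", "current_user()") // SQL as the user wrote it
>   .putString("EXISTS_DEFAULT", "'alice'")         // value frozen at creation
>   .build(){code}
> 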
> The typical workflow is:
> {code:scala}
> // Resolve the user-provided default SQL once, at column creation time
> val existsSQL = ResolveDefaultColumns.analyze(col, dataType, defaultSQL)
> // save both existsSQL and defaultSQL to the column metadata{code}
>  
> But problematically, this code then sets the EXISTS_DEFAULT metadata back to 
> the original, unresolved defaultSQL.
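> 
> A minimal sketch of the problematic write-back (simplified and hypothetical, 
> not the actual toV1Column code):
> {code:scala}
> import org.apache.spark.sql.types.MetadataBuilder
> 
> val defaultSQL = "current_user()" // user-provided CURRENT_DEFAULT
> val existsSQL  = "'alice'"        // resolved EXISTS_DEFAULT (hypothetical)
> 
> // Bug: the raw defaultSQL is written under both keys, discarding existsSQL
> val corrupted = new MetadataBuilder()
>   .putString("CURRENT_DEFAULT", defaultSQL) // correct
>   .putString("EXISTS_DEFAULT", defaultSQL)  // wrong: re-introduces raw SQL
>   .build()
> 
> // Expected: the already-resolved existsSQL should be preserved
> val expected = new MetadataBuilder()
>   .putString("CURRENT_DEFAULT", defaultSQL)
>   .putString("EXISTS_DEFAULT", existsSQL)
>   .build(){code}
> 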
> Consuming code then sees corrupted, unresolved expressions such as 
> current_user() in EXISTS_DEFAULT. When these columns are later read, the 
> expressions may need to be resolved again, potentially yielding values that 
> change over time.
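> 
> To make the hazard concrete (a toy model, not Spark code), re-resolution at 
> read time depends on the reader's session:
> {code:scala}
> // Toy resolver: an unresolved EXISTS_DEFAULT is re-evaluated per session
> def resolveAtRead(existsDefault: String, sessionUser: String): String =
>   if (existsDefault == "current_user()") s"'$sessionUser'" else existsDefault
> 
> resolveAtRead("current_user()", "alice") // "'alice'" when alice reads
> resolveAtRead("current_user()", "bob")   // "'bob'" when bob reads: changed{code}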
>  
> While the Spark code base does not currently contain such catalogs, some 
> external catalogs out there are hitting this issue.


