[ https://issues.apache.org/jira/browse/SPARK-49639?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Mihailo Milosevic updated SPARK-49639:
--------------------------------------
    Description: INVALID_INTERVAL_WITH_MICROSECONDS_ADDITION contains a 
suggested fix that recommends turning off ANSI mode. Now that Spark 4.0.0 has 
moved to ANSI mode on by default, we want to keep suggestions of this kind to 
a minimum. There are `try*` functions which provide a safe way to get the same 
behaviour as with ANSI mode off, and suggestions of that kind should be 
sufficient.  (was: DATETIME_FIELD_OUT_OF_BOUNDS contains a suggested fix that 
recommends turning off ANSI mode. Now that Spark 4.0.0 has moved to ANSI mode 
on by default, we want to keep suggestions of this kind to a minimum. There 
are `try*` functions which provide a safe way to get the same behaviour as 
with ANSI mode off, and suggestions of that kind should be sufficient.)
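
A minimal sketch (not from the ticket) of the scenario being discussed, 
assuming the built-in make_interval and try_add SQL functions; the literal 
values are illustrative:

    -- With ANSI mode on (the Spark 4.0.0 default), adding a calendar
    -- interval whose microseconds part is non-zero to a DATE raises
    -- INVALID_INTERVAL_WITH_MICROSECONDS_ADDITION:
    SELECT DATE'2024-01-01' + make_interval(0, 0, 0, 0, 0, 0, 0.5);

    -- try_add accepts the same input types as the + operator and returns
    -- NULL on failure instead of raising an error, so the message can point
    -- users here rather than suggesting spark.sql.ansi.enabled=false:
    SELECT try_add(DATE'2024-01-01', make_interval(0, 0, 0, 0, 0, 0, 0.5));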

> Remove the ANSI config suggestion in 
> INVALID_INTERVAL_WITH_MICROSECONDS_ADDITION
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-49639
>                 URL: https://issues.apache.org/jira/browse/SPARK-49639
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 4.0.0
>            Reporter: Mihailo Milosevic
>            Priority: Major
>              Labels: starter
>
> INVALID_INTERVAL_WITH_MICROSECONDS_ADDITION contains a suggested fix that 
> recommends turning off ANSI mode. Now that Spark 4.0.0 has moved to ANSI 
> mode on by default, we want to keep suggestions of this kind to a minimum. 
> There are `try*` functions which provide a safe way to get the same 
> behaviour as with ANSI mode off, and suggestions of that kind should be 
> sufficient.


