[ https://issues.apache.org/jira/browse/SPARK-37692?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

JacobZheng updated SPARK-37692:
-------------------------------
    Description: 
Description in the documentation:
{code:java}
In Spark 3.2, the unit list interval literals can not mix year-month fields 
(YEAR and MONTH) and day-time fields (WEEK, DAY, …, MICROSECOND). For example, 
INTERVAL 1 day 1 hour is invalid in Spark 3.2. In Spark 3.1 and earlier, there 
is no such limitation and the literal returns value of CalendarIntervalType. To 
restore the behavior before Spark 3.2, you can set 
spark.sql.legacy.interval.enabled to true. {code}
"INTERVAL 1 day 1 hour is invalid in Spark 3.2."

Is this example correct? According to the description of DayTimeIntervalType, DAY and 
HOUR are both day-time fields, so INTERVAL 1 day 1 hour does not mix year-month and 
day-time units and should be valid in Spark 3.2.
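
For clarity, here is a short SQL sketch of how I read the restriction; the mixed-unit 
counter-example (INTERVAL 1 year 1 hour) is my own illustration and is not taken from 
the guide:
{code:sql}
-- DAY and HOUR are both day-time fields, so this literal does not mix
-- year-month and day-time units; in Spark 3.2 it should resolve to
-- DayTimeIntervalType rather than fail.
SELECT INTERVAL 1 day 1 hour;

-- This literal mixes a year-month field (YEAR) with a day-time field (HOUR),
-- which is what the migration guide says is no longer allowed in Spark 3.2.
SELECT INTERVAL 1 year 1 hour;

-- Per the quoted text, this setting restores the pre-3.2 behavior where the
-- literal returns a value of CalendarIntervalType.
SET spark.sql.legacy.interval.enabled=true;
{code}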

  was:
Description in the documentation:
{code:java}
// code placeholder
In Spark 3.2, the unit list interval literals can not mix year-month fields 
(YEAR and MONTH) and day-time fields (WEEK, DAY, …, MICROSECOND). For example, 
INTERVAL 1 day 1 hour is invalid in Spark 3.2. In Spark 3.1 and earlier, there 
is no such limitation and the literal returns value of CalendarIntervalType. To 
restore the behavior before Spark 3.2, you can set 
spark.sql.legacy.interval.enabled to true. {code}
"INTERVAL 1 day 1 hour is invalid in Spark 3.2."

Is this example correct? According to the description of DayTimeIntervalType, 
INTERVAL 1 day 1 hour is valid in Spark 3.2.


> sql-migration-guide wrong description
> -------------------------------------
>
>                 Key: SPARK-37692
>                 URL: https://issues.apache.org/jira/browse/SPARK-37692
>             Project: Spark
>          Issue Type: Documentation
>          Components: Documentation
>    Affects Versions: 3.2.0
>            Reporter: JacobZheng
>            Priority: Trivial
>
> Description in the documentation:
> {code:java}
> In Spark 3.2, the unit list interval literals can not mix year-month fields 
> (YEAR and MONTH) and day-time fields (WEEK, DAY, …, MICROSECOND). For 
> example, INTERVAL 1 day 1 hour is invalid in Spark 3.2. In Spark 3.1 and 
> earlier, there is no such limitation and the literal returns value of 
> CalendarIntervalType. To restore the behavior before Spark 3.2, you can set 
> spark.sql.legacy.interval.enabled to true. {code}
> "INTERVAL 1 day 1 hour is invalid in Spark 3.2."
> Is this example correct? According to the description of DayTimeIntervalType, 
> INTERVAL 1 day 1 hour is valid in Spark 3.2.


