[ 
https://issues.apache.org/jira/browse/SPARK-36970?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Max Gekk reassigned SPARK-36970:
--------------------------------

    Assignee: Yang Jie

> Manually disable format `B` for the `date_format` function for compatibility with
> Java 8 behavior.
> --------------------------------------------------------------------------------------------
>
>                 Key: SPARK-36970
>                 URL: https://issues.apache.org/jira/browse/SPARK-36970
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: Yang Jie
>            Assignee: Yang Jie
>            Priority: Major
>
> The `date_format` function behaves differently on JDK 8 and JDK 17. The result of
> {{select date_format('2018-11-17 13:33:33.333', 'B')}} in
> {{datetime-formatting-invalid.sql}} with Java 8 is:
> {code:java}
> -- !query
> select date_format('2018-11-17 13:33:33.333', 'B')
> -- !query schema
> struct<>
> -- !query output
> java.lang.IllegalArgumentException
> Unknown pattern letter: B
> {code}
> and with Java 17 the result is:
> {code:java}
> - datetime-formatting-invalid.sql *** FAILED ***
>   datetime-formatting-invalid.sql
>   Expected "struct<[]>", but got "struct<[date_format(2018-11-17 
> 13:33:33.333, B):string]>" Schema did not match for query #34
>   select date_format('2018-11-17 13:33:33.333', 'B'): -- !query
>   select date_format('2018-11-17 13:33:33.333', 'B')
>   -- !query schema
>   struct<date_format(2018-11-17 13:33:33.333, B):string>
>   -- !query output
>   in the afternoon (SQLQueryTestSuite.scala:469)
> {code}
>  
> From the Javadoc we can see that in Java 17 {{B}} is the pattern letter used to 
> output a day period.
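>
> A minimal, JDK-only sketch of the difference (assuming the default English locale; 
> the class name {{DayPeriodDemo}} is made up for illustration and is not part of Spark):
> {code:java}
> import java.time.LocalTime;
> import java.time.format.DateTimeFormatter;
>
> public class DayPeriodDemo {
>     public static void main(String[] args) {
>         // On JDK 8, 'B' is not a recognized pattern letter, so ofPattern throws
>         // java.lang.IllegalArgumentException: Unknown pattern letter: B.
>         // On JDK 16+ (including JDK 17), 'B' means "day period".
>         DateTimeFormatter fmt = DateTimeFormatter.ofPattern("B");
>         // On JDK 17 this prints a day period such as "in the afternoon".
>         System.out.println(fmt.format(LocalTime.of(13, 33, 33)));
>     }
> }
> {code}
> This matches the outputs above: the Java 8 run fails with {{Unknown pattern letter: B}}, 
> while the Java 17 run formats the value as "in the afternoon".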


