[ 
https://issues.apache.org/jira/browse/SPARK-30688?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17031614#comment-17031614
 ] 

Attila Zsolt Piros commented on SPARK-30688:
--------------------------------------------

I have checked on Spark 3.0.0-preview2, and the week-in-year pattern fails there too:

{noformat}
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.0.0-preview2
      /_/

Using Scala version 2.12.10 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_202)
Type in expressions to have them evaluated.
Type :help for more information.

scala> spark.sql("select unix_timestamp('20201', 'yyyyww')").show();
+-----------------------------+
|unix_timestamp(20201, yyyyww)|
+-----------------------------+
|                         null|
+-----------------------------+
{noformat}

As you can see, it fails even for a simpler pattern:

{noformat}
scala> spark.sql("select unix_timestamp('2020-10', 'yyyy-ww')").show();
+--------------------------------+
|unix_timestamp(2020-10, yyyy-ww)|
+--------------------------------+
|                            null|
+--------------------------------+
{noformat}

But, strangely enough, this works:

{noformat}
scala> spark.sql("select unix_timestamp('2020-01', 'yyyy-ww')").show();
+--------------------------------+
|unix_timestamp(2020-01, yyyy-ww)|
+--------------------------------+
|                      1577833200|
+--------------------------------+
{noformat}
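
For reference, here is a quick spark-shell sketch (plain JDK classes only, nothing Spark-specific; the value names are mine) that probes the same value/pattern pairs with both the legacy java.text.SimpleDateFormat and the java.time DateTimeFormatter, to check whether the difference comes from the underlying formatter rather than from Spark itself:

{code:scala}
import java.text.SimpleDateFormat
import java.time.format.DateTimeFormatter
import scala.util.Try

// (value, pattern) pairs taken from the examples above
val cases = Seq(
  ("20201", "yyyyww"),
  ("20202", "yyyyww"),
  ("2020-01", "yyyy-ww"),
  ("2020-10", "yyyy-ww"))

cases.foreach { case (value, pattern) =>
  // legacy formatter used before Spark 3.0 (lenient by default)
  val legacy = Try(new SimpleDateFormat(pattern).parse(value).getTime / 1000L)
  // java.time formatter backing the Spark 3.0 parsers (yields raw fields only)
  val modern = Try(DateTimeFormatter.ofPattern(pattern).parse(value))
  println(s"$value with '$pattern': legacy=$legacy, modern=$modern")
}
{code}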


> Spark SQL Unix Timestamp produces incorrect result with unix_timestamp UDF
> --------------------------------------------------------------------------
>
>                 Key: SPARK-30688
>                 URL: https://issues.apache.org/jira/browse/SPARK-30688
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.2
>            Reporter: Rajkumar Singh
>            Priority: Major
>
>  
> {code:java}
> scala> spark.sql("select unix_timestamp('20201', 'yyyyww')").show();
> +-----------------------------+
> |unix_timestamp(20201, yyyyww)|
> +-----------------------------+
> |                         null|
> +-----------------------------+
>  
> scala> spark.sql("select unix_timestamp('20202', 'yyyyww')").show();
> +-----------------------------+
> |unix_timestamp(20202, yyyyww)|
> +-----------------------------+
> |                   1578182400|
> +-----------------------------+
>  
> {code}
>  
>  
> This seems to happen for leap years only. I dug deeper into it, and it seems 
> that Spark uses java.text.SimpleDateFormat and tries to parse the 
> expression here:
> [org.apache.spark.sql.catalyst.expressions.UnixTime#eval|https://github.com/hortonworks/spark2/blob/49ec35bbb40ec6220282d932c9411773228725be/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala#L652]
> {code:java}
> formatter.parse(
>  t.asInstanceOf[UTF8String].toString).getTime / 1000L{code}
> but it fails: SimpleDateFormat is unable to parse the date and throws an 
> "Unparseable date" ParseException, which Spark handles silently and returns NULL.
>  
> *Spark-3.0:* I did some tests; Spark no longer uses the legacy 
> java.text.SimpleDateFormat but the java.time date/time API, and that API 
> expects a valid date in a valid format:
>  org.apache.spark.sql.catalyst.util.Iso8601TimestampFormatter#parse
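>  
> To make the legacy behaviour concrete, here is a rough standalone sketch of that eval path (simplified, with made-up helper names; not the exact Catalyst code): the ParseException is swallowed and the row simply comes back as NULL.
> {code:scala}
> import java.text.{ParseException, SimpleDateFormat}
> 
> // Simplified stand-in for the legacy branch of UnixTime#eval (hypothetical
> // helper): parse with SimpleDateFormat, convert to epoch seconds, and
> // swallow any parse failure.
> def unixTimestampLegacy(value: String, pattern: String): java.lang.Long =
>   try {
>     java.lang.Long.valueOf(
>       new SimpleDateFormat(pattern).parse(value).getTime / 1000L)
>   } catch {
>     case _: ParseException => null // surfaces as NULL in the SQL result
>   }
> 
> unixTimestampLegacy("20202", "yyyyww") // parses under the lenient legacy rules
> unixTimestampLegacy("20201", "yyyyww") // may come back null if parsing fails
> {code}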



