[ https://issues.apache.org/jira/browse/SPARK-13341?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15171300#comment-15171300 ]

Xiao Li commented on SPARK-13341:
---------------------------------

Yeah. To get a correct answer, you just need to remove the trailing three zeros: in 1.6.0 the cast interprets the numeric value as seconds since the epoch, so a millisecond value has to be converted to seconds first.
For example, 
{code}
    sql("SELECT CAST(1455580840 AS TIMESTAMP) as ts, CAST(CAST(1455580840 AS 
TIMESTAMP) AS DATE) as d").show()
{code}
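
If the source value actually holds epoch milliseconds and you want to keep the sub-second part rather than drop it, one alternative (a minimal sketch, not from this thread) is to divide by 1000 before the cast, since a fractional number cast to TIMESTAMP is read as seconds:
{code}
    sql("SELECT CAST(1455580840000 / 1000 AS TIMESTAMP) as ts, CAST(CAST(1455580840000 / 1000 AS TIMESTAMP) AS DATE) as d").show()
{code}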


> Casting Unix timestamp to SQL timestamp fails
> ---------------------------------------------
>
>                 Key: SPARK-13341
>                 URL: https://issues.apache.org/jira/browse/SPARK-13341
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: William Dee
>
> The way that unix timestamp casting is handled has been broken between Spark 
> 1.5.2 and Spark 1.6.0. This can be easily demonstrated via the spark-shell:
> {code:title=1.5.2}
> scala> sqlContext.sql("SELECT CAST(1455580840000 AS TIMESTAMP) as ts, 
> CAST(CAST(1455580840000 AS TIMESTAMP) AS DATE) as d").show
> +--------------------+----------+
> |                  ts|         d|
> +--------------------+----------+
> |2016-02-16 00:00:...|2016-02-16|
> +--------------------+----------+
> {code}
> {code:title=1.6.0}
> scala> sqlContext.sql("SELECT CAST(1455580840000 AS TIMESTAMP) as ts, 
> CAST(CAST(1455580840000 AS TIMESTAMP) AS DATE) as d").show
> +--------------------+----------+
> |                  ts|         d|
> +--------------------+----------+
> |48095-07-09 12:06...|095-07-09|
> +--------------------+----------+
> {code}
> I'm not sure exactly what is causing this, but the defect was definitely 
> introduced in Spark 1.6.0: jobs that relied on this functionality ran on 
> 1.5.2 and no longer run correctly on 1.6.0.



