[ https://issues.apache.org/jira/browse/SPARK-13341?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15154694#comment-15154694 ]

Srinivasa Reddy Vundela commented on SPARK-13341:
-------------------------------------------------

I guess the following commit is the reason for the change:
https://github.com/apache/spark/commit/9ed4ad4265cf9d3135307eb62dae6de0b220fc21

It seems HIVE-3454 was fixed in Hive 1.2.0, so customers using earlier versions 
of Hive will see this problem.
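
If you are hitting this on 1.6.0, a possible workaround (a sketch, assuming the 
values are epoch milliseconds) is to divide by 1000 before casting, since after 
the commit above CAST(<numeric> AS TIMESTAMP) treats the value as seconds:

{code}
// Workaround sketch, assuming 1455580840000 is epoch *milliseconds*.
// Dividing by 1000 converts the value to seconds, matching the new cast
// semantics, which should recover the 1.5.2 result (2016-02-16 ...).
scala> sqlContext.sql("SELECT CAST(1455580840000 / 1000 AS TIMESTAMP) as ts, CAST(CAST(1455580840000 / 1000 AS TIMESTAMP) AS DATE) as d").show
{code}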

> Casting Unix timestamp to SQL timestamp fails
> ---------------------------------------------
>
>                 Key: SPARK-13341
>                 URL: https://issues.apache.org/jira/browse/SPARK-13341
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: William Dee
>
> The way that unix timestamp casting is handled changed between Spark 1.5.2 and 
> Spark 1.6.0, breaking previously working behavior. This can easily be 
> demonstrated via the spark-shell:
> {code:title=1.5.2}
> scala> sqlContext.sql("SELECT CAST(1455580840000 AS TIMESTAMP) as ts, CAST(CAST(1455580840000 AS TIMESTAMP) AS DATE) as d").show
> +--------------------+----------+
> |                  ts|         d|
> +--------------------+----------+
> |2016-02-16 00:00:...|2016-02-16|
> +--------------------+----------+
> {code}
> {code:title=1.6.0}
> scala> sqlContext.sql("SELECT CAST(1455580840000 AS TIMESTAMP) as ts, CAST(CAST(1455580840000 AS TIMESTAMP) AS DATE) as d").show
> +--------------------+-----------+
> |                  ts|          d|
> +--------------------+-----------+
> |48095-07-09 12:06...|48095-07-09|
> +--------------------+-----------+
> {code}
> I'm not sure exactly what is causing this, but the defect was definitely 
> introduced in Spark 1.6.0: jobs that relied on this functionality ran on 
> 1.5.2 and no longer run correctly on 1.6.0.
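
As a quick sanity check (a sketch), the 48095 in the 1.6.0 output is exactly 
what you get if the millisecond value is read as seconds:

{code}
// 1455580840000 read as *seconds* since 1970-01-01:
// 1455580840000 / 31,556,952 (seconds per mean Gregorian year) is roughly
// 46,125 years, and 1970 + 46,125 lands around year 48095 -- matching the
// 1.6.0 output above.
scala> 1455580840000L / (365.2425 * 24 * 3600)
{code}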


