[ 
https://issues.apache.org/jira/browse/SPARK-13341?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15171364#comment-15171364
 ] 

Linbo edited comment on SPARK-13341 at 2/29/16 3:33 AM:
--------------------------------------------------------

Inside org.apache.spark.sql.catalyst.expressions.Cast, the input long value is 
assumed to be seconds, but you passed milliseconds. 

https://github.com/apache/spark/blob/c5e7076da72657ea35a0aa388f8d2e6411d39280/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala#L214-L219

```scala
// Cast.scala (Spark 1.6.0)

// converts seconds to microseconds
private[this] def longToTimestamp(t: Long): Long = t * 1000000L
```
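
A minimal sketch (not part of the ticket) of why a millisecond epoch overshoots: `longToTimestamp` multiplies its input by 1,000,000, so a value that is already in milliseconds ends up a factor of 1000 too large. The object and value names below are illustrative, not Spark internals; only `longToTimestamp` mirrors the Spark 1.6.0 source above.

```scala
object CastSketch {
  // mirrors Cast.longToTimestamp in Spark 1.6.0: seconds -> microseconds
  def longToTimestamp(t: Long): Long = t * 1000000L

  val millis = 1455580840000L                        // the value from the bug report
  val wrongMicros   = longToTimestamp(millis)        // treated as seconds: far-future date
  val correctMicros = longToTimestamp(millis / 1000) // convert to seconds first
}
```

So one possible workaround (a sketch, untested here) is to divide the millisecond value by 1000 before casting, e.g. `SELECT CAST(1455580840000 / 1000 AS TIMESTAMP)`.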







> Casting Unix timestamp to SQL timestamp fails
> ---------------------------------------------
>
>                 Key: SPARK-13341
>                 URL: https://issues.apache.org/jira/browse/SPARK-13341
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: William Dee
>
> The way that unix timestamp casting is handled has been broken between Spark 
> 1.5.2 and Spark 1.6.0. This can be easily demonstrated via the spark-shell:
> {code:title=1.5.2}
> scala> sqlContext.sql("SELECT CAST(1455580840000 AS TIMESTAMP) as ts, 
> CAST(CAST(1455580840000 AS TIMESTAMP) AS DATE) as d").show
> +--------------------+----------+
> |                  ts|         d|
> +--------------------+----------+
> |2016-02-16 00:00:...|2016-02-16|
> +--------------------+----------+
> {code}
> {code:title=1.6.0}
> scala> sqlContext.sql("SELECT CAST(1455580840000 AS TIMESTAMP) as ts, 
> CAST(CAST(1455580840000 AS TIMESTAMP) AS DATE) as d").show
> +--------------------+----------+
> |                  ts|         d|
> +--------------------+----------+
> |48095-07-09 12:06...|095-07-09|
> +--------------------+----------+
> {code}
> I'm not sure what exactly is causing this, but the defect was definitely 
> introduced in Spark 1.6.0: jobs that relied on this functionality ran on 
> 1.5.2 and no longer run on 1.6.0.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
