These two are just not equivalent.

Spark SQL interprets a long as seconds since the epoch when casting between
timestamps and numerics, therefore
lit(1485503350000L).cast(org.apache.spark.sql.types.TimestampType)
represents 49043-09-23 21:26:40.0. This behavior is intentional - see for
example https://issues.apache.org/jira/browse/SPARK-11724
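
A minimal sketch of the two unit conventions in plain Scala using java.time
(no Spark session needed; the variable names are mine, and the exact string
Spark prints additionally depends on the session timezone):

```scala
import java.time.Instant

val v = 1485503350000L

// Interpreted as SECONDS since the epoch (Spark SQL's numeric -> timestamp
// cast convention), the value lands roughly 47,000 years in the future:
val asSeconds = Instant.ofEpochSecond(v) // a year-49043 instant

// Interpreted as MILLISECONDS (java.sql.Timestamp's constructor convention),
// it is an ordinary 2017 instant:
val asMillis = Instant.ofEpochMilli(v)

println(asSeconds)
println(asMillis) // 2017-01-27T07:49:10Z in UTC
```

The 08:49:10 shown in the thread is presumably the same instant rendered in a
UTC+1 session timezone.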

java.sql.Timestamp, on the other hand, expects milliseconds as its
constructor argument, therefore lit(new java.sql.Timestamp(1485503350000L))
represents 2017-01-27 08:49:10.
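
As a sketch of reconciling the two conventions (again plain Scala, no Spark;
the division step mirrors what the seconds-based cast expects, and the value
here has no sub-second part, so nothing is truncated):

```scala
import java.sql.Timestamp
import java.time.Instant

val millis = 1485503350000L

// The constructor takes milliseconds since the epoch:
val fromMillis = new Timestamp(millis)

// Reaching the same instant through a seconds-based path requires
// dividing by 1000 first:
val fromSeconds = Timestamp.from(Instant.ofEpochSecond(millis / 1000L))

println(fromMillis == fromSeconds) // true
```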

On 15 August 2017 at 13:16, assaf.mendelson <assaf.mendel...@rsa.com> wrote:

> Hi all,
>
> I encountered weird behavior for timestamps. It seems that when using lit
> to add one to a column, the timestamp goes from a milliseconds
> representation to a seconds representation:
>
>
> scala> spark.range(1).withColumn("a", lit(new java.sql.Timestamp(1485503350000L)).cast("long")).show()
>
> +---+----------+
> | id|         a|
> +---+----------+
> |  0|1485503350|
> +---+----------+
>
> scala> spark.range(1).withColumn("a", lit(1485503350000L).cast(org.apache.spark.sql.types.TimestampType).cast(org.apache.spark.sql.types.LongType)).show()
>
> +---+-------------+
> | id|            a|
> +---+-------------+
> |  0|1485503350000|
> +---+-------------+
>
> Is this a bug or am I missing something here?
>
> Thanks,
>
>         Assaf
>
> ------------------------------
> View this message in context: Possible bug: inconsistent timestamp behavior
> <http://apache-spark-developers-list.1001551.n3.nabble.com/Possible-bug-inconsistent-timestamp-behavior-tp22144.html>
>



-- 

Best regards,
Maciej Szymkiewicz
