That sounds like a bug. Could you create a JIRA and ping Yin Huai
(cc'ed)?
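
In the meantime, a possible workaround (just a sketch against the df in
your example; I have not verified it on 1.4.0) is to compare against a
typed java.sql.Timestamp, or to cast the string literal to a timestamp
explicitly, instead of relying on the implicit string comparison:

  import java.sql.Timestamp
  import org.apache.spark.sql.functions.lit
  import sqlContext.implicits._  // for the $"..." column syntax

  // compare against a typed Timestamp value instead of a raw string
  df.filter($"_2" <= Timestamp.valueOf("2014-06-01 00:00:00")).show()

  // or cast the string literal to a timestamp explicitly
  df.filter($"_2" <= lit("2014-06-01 00:00:00").cast("timestamp")).show()

-Xiangrui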

On Wed, May 27, 2015 at 12:57 AM, Justin Yip <yipjus...@prediction.io> wrote:
> Hello,
>
> I am trying out 1.4.0 and have noticed a difference in Timestamp
> behavior between 1.3.1 and 1.4.0.
>
> In 1.3.1, I can compare a Timestamp column with a string:
> scala> val df = sqlContext.createDataFrame(Seq(
>   (1, Timestamp.valueOf("2015-01-01 00:00:00")),
>   (2, Timestamp.valueOf("2014-01-01 00:00:00"))))
> ...
> scala> df.filter($"_2" <= "2014-06-01").show
> ...
> _1 _2
> 2  2014-01-01 00:00:...
>
> However, in 1.4.0, the filter is always false:
> scala> val df = sqlContext.createDataFrame(Seq(
>   (1, Timestamp.valueOf("2015-01-01 00:00:00")),
>   (2, Timestamp.valueOf("2014-01-01 00:00:00"))))
> df: org.apache.spark.sql.DataFrame = [_1: int, _2: timestamp]
>
> scala> df.filter($"_2" <= "2014-06-01").show
> +--+--+
> |_1|_2|
> +--+--+
> +--+--+
>
> I am not sure if this is intended, but I cannot find any documentation
> mentioning this change in behavior.
>
> Thanks.
>
> Justin
