Done.

https://issues.apache.org/jira/browse/SPARK-8420

Justin

On Wed, Jun 17, 2015 at 4:06 PM, Xiangrui Meng <men...@gmail.com> wrote:

> That sounds like a bug. Could you create a JIRA and ping Yin Huai
> (cc'ed). -Xiangrui
>
> On Wed, May 27, 2015 at 12:57 AM, Justin Yip <yipjus...@prediction.io>
> wrote:
> > Hello,
> >
> > I am trying out 1.4.0 and noticed some differences in Timestamp behavior
> > between 1.3.1 and 1.4.0.
> >
> > In 1.3.1, I can compare a Timestamp with a string:
> > scala> val df = sqlContext.createDataFrame(Seq((1, Timestamp.valueOf("2015-01-01 00:00:00")), (2, Timestamp.valueOf("2014-01-01 00:00:00"))))
> > ...
> > scala> df.filter($"_2" <= "2014-06-01").show
> > ...
> > _1 _2
> > 2  2014-01-01 00:00:...
> >
> > However, in 1.4.0, the same filter always evaluates to false:
> > scala> val df = sqlContext.createDataFrame(Seq((1, Timestamp.valueOf("2015-01-01 00:00:00")), (2, Timestamp.valueOf("2014-01-01 00:00:00"))))
> > df: org.apache.spark.sql.DataFrame = [_1: int, _2: timestamp]
> >
> > scala> df.filter($"_2" <= "2014-06-01").show
> > +--+--+
> > |_1|_2|
> > +--+--+
> > +--+--+
> >
> > Not sure if that is intended, but I cannot find any doc mentioning this
> > inconsistency.
> >
> > Thanks.
> >
> > Justin
> >
> > ________________________________
> > View this message in context: Inconsistent behavior with Dataframe Timestamp between 1.3.1 and 1.4.0
> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
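For anyone hitting this before the JIRA is resolved: comparing the column against an explicit Timestamp value, rather than a bare string like "2014-06-01", sidesteps the string-coercion behavior that changed between releases. The sketch below illustrates only that explicit-Timestamp idea on plain Scala collections (no Spark), so it runs standalone; the object name and the `!ts.after(cutoff)` spelling of "less than or equal" are my own choices, not part of the original report.

```scala
import java.sql.Timestamp

object TimestampFilterSketch {
  def main(args: Array[String]): Unit = {
    // Same rows as in the original spark-shell example.
    val rows = Seq(
      (1, Timestamp.valueOf("2015-01-01 00:00:00")),
      (2, Timestamp.valueOf("2014-01-01 00:00:00"))
    )
    // Build the cutoff as a Timestamp instead of the bare string
    // "2014-06-01", so the comparison never relies on implicit
    // string-to-timestamp coercion.
    val cutoff = Timestamp.valueOf("2014-06-01 00:00:00")
    // !ts.after(cutoff) is "ts <= cutoff" for java.sql.Timestamp.
    val kept = rows.filter { case (_, ts) => !ts.after(cutoff) }
    kept.foreach { case (id, ts) => println(s"$id $ts") }
  }
}
```

In a DataFrame filter the analogous move would be passing a Timestamp (or explicitly casting the string literal to a timestamp) on the right-hand side of the comparison, instead of letting the string be coerced.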
