I even tried without the “T”, and it still returns an empty result:

scala> val sRdd = sqlContext.sql("select a from x where ts >= '2012-01-01 00:00:00';")
sRdd: org.apache.spark.sql.SchemaRDD =
SchemaRDD[35] at RDD at SchemaRDD.scala:103
== Query Plan ==
== Physical Plan ==
Project [a#0]
ExistingRdd [a#0,ts#1], MapPartitionsRDD[37] at mapPartitions at basicOperators.scala:208

scala> sRdd.collect
res10: Array[org.apache.spark.sql.Row] = Array()
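
A quick sanity check (just a sketch, assuming the same rdd and sqlContext as in the snippets above) is to print the schema and inspect the raw ts values, to rule out the column having been inferred as a string rather than a timestamp:

scala> rdd.printSchema()  // ts should show up as "timestamp", not "string"
scala> sqlContext.sql("select ts from x").collect().foreach(println)  // inspect the raw values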


Mohammed

From: Cheng Lian [mailto:lian.cs....@gmail.com]
Sent: Friday, October 10, 2014 10:14 PM
To: Mohammed Guller; user@spark.apache.org
Subject: Re: Spark SQL parser bug?


Hmm, there is a “T” in the timestamp string, which makes it an invalid timestamp string representation. Internally, Spark SQL uses java.sql.Timestamp.valueOf to cast a string to a timestamp.
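
For reference, a minimal REPL check of that cast behavior, using only the JDK (no Spark involved):

java.sql.Timestamp.valueOf("2012-01-01 00:00:00")  // parses: valueOf expects "yyyy-mm-dd hh:mm:ss[.fffffffff]"
java.sql.Timestamp.valueOf("2012-01-01T00:00:00")  // throws IllegalArgumentException because of the "T"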

On 10/11/14 2:08 AM, Mohammed Guller wrote:
scala> rdd.registerTempTable("x")

scala> val sRdd = sqlContext.sql("select a from x where ts >= '2012-01-01T00:00:00';")
sRdd: org.apache.spark.sql.SchemaRDD =
SchemaRDD[4] at RDD at SchemaRDD.scala:103
== Query Plan ==
== Physical Plan ==
Project [a#0]
ExistingRdd [a#0,ts#1], MapPartitionsRDD[6] at mapPartitions at basicOperators.scala:208

scala> sRdd.collect
res2: Array[org.apache.spark.sql.Row] = Array()