Hi Mohammed,

Would you mind sharing the DDL of the table x and the complete stack trace of the exception you got? A full Spark shell session history would be even more helpful. PR #2084 was merged into master in August, and the timestamp type is supported in 1.1.

I tried the following snippets in Spark shell (v1.1), and didn’t observe this issue:

scala> import org.apache.spark.sql._
import org.apache.spark.sql._

scala> import sc._
import sc._

scala> val sqlContext = new SQLContext(sc)
sqlContext: org.apache.spark.sql.SQLContext = org.apache.spark.sql.SQLContext@6c3441c5

scala> import sqlContext._
import sqlContext._

scala> case class Record(a: Int, ts: java.sql.Timestamp)
defined class Record

scala> makeRDD(Seq.empty[Record], 1).registerTempTable("x")

scala> sql("SELECT a FROM x WHERE ts >= '2012-01-01T00:00:00' AND ts <= '2012-03-31T23:59:59'")
res1: org.apache.spark.sql.SchemaRDD =
SchemaRDD[3] at RDD at SchemaRDD.scala:103
== Query Plan ==
== Physical Plan ==
Project [a#0]
 ExistingRdd [a#0,ts#1], MapPartitionsRDD[5] at mapPartitions at basicOperators.scala:208

scala> res1.collect()
...
res2: Array[org.apache.spark.sql.Row] = Array()
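If it helps, here is a variant with non-empty data, so the timestamp predicate is actually evaluated against rows. This part is a sketch rather than a session I've pasted; the table name y and the sample timestamps are made up:

// Sketch: the same query against a non-empty table, reusing the
// Record case class defined in the session above. The table name
// "y" and the sample rows are illustrative.
val rows = Seq(
  Record(1, java.sql.Timestamp.valueOf("2012-02-15 12:00:00")),  // inside the range
  Record(2, java.sql.Timestamp.valueOf("2013-06-01 08:30:00")))  // outside the range
makeRDD(rows, 1).registerTempTable("y")
sql("SELECT a FROM y WHERE ts >= '2012-01-01T00:00:00' AND ts <= '2012-03-31T23:59:59'").collect()
// Should return only the a = 1 row if the string-to-timestamp
// comparison resolves the same way as in the session above.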

Cheng

On 10/9/14 10:26 AM, Mohammed Guller wrote:

Hi –

When I run the following Spark SQL query in Spark shell (version 1.1.0):

val rdd = sqlContext.sql("SELECT a FROM x WHERE ts >= '2012-01-01T00:00:00' AND ts <= '2012-03-31T23:59:59' ")

it gives the following error:

rdd: org.apache.spark.sql.SchemaRDD =
SchemaRDD[294] at RDD at SchemaRDD.scala:103
== Query Plan ==
== Physical Plan ==
java.util.NoSuchElementException: head of empty list

The ts column in the WHERE clause is of type timestamp and holds timestamp data. If I replace the string '2012-01-01T00:00:00' in the WHERE clause with its epoch value, then the query works fine.
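For reference, the epoch-based version that works looks roughly like this (a sketch; the literals are my computation of the epoch seconds for the two boundary timestamps, assuming UTC, and the exact form I used may differ slightly):

// 1325376000 = 2012-01-01T00:00:00 UTC, 1333238399 = 2012-03-31T23:59:59 UTC
val rdd = sqlContext.sql("SELECT a FROM x WHERE ts >= 1325376000 AND ts <= 1333238399")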

It looks like I have run into the issue described in this pull request: https://github.com/apache/spark/pull/2084

Is that PR not merged in Spark version 1.1.0? Or am I missing something?

Thanks,

Mohammed
