RE: Spark SQL parser bug?

2014-11-25 Thread Leon
Hello, I just stumbled on exactly the same issue as you are discussing in this thread. Here are my dependencies:

<dependencies>
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.1.0</version>
  </dependency>
</dependencies>

RE: Spark SQL parser bug?

2014-11-25 Thread Mohammed Guller
Leon, I solved the problem with a workaround, so I didn't need to upgrade to 1.1.2-SNAPSHOT. Mohammed -Original Message- From: Leon [mailto:pachku...@gmail.com] Sent: Tuesday, November 25, 2014 11:36 AM To: u...@spark.incubator.apache.org Subject: RE: Spark SQL

RE: Spark SQL parser bug?

2014-10-13 Thread Mohammed Guller
From: Cheng, Hao [mailto:hao.ch...@intel.com] Sent: Sunday, October 12, 2014 1:35 AM To: Mohammed Guller; Cheng Lian; user@spark.apache.org Subject: RE: Spark SQL parser bug? Hi, I couldn’t reproduce the bug with the latest master branch. Which version are you using? Can you also list data

Re: Spark SQL parser bug?

2014-10-13 Thread Yin Huai
Lian; user@spark.apache.org *Subject:* RE: Spark SQL parser bug? Hi, I couldn’t reproduce the bug with the latest master branch. Which version are you using? Can you also list data in the table “x”? case class T(a:String, ts:java.sql.Timestamp) val sqlContext = new

RE: Spark SQL parser bug?

2014-10-13 Thread Mohammed Guller
: Cheng, Hao; Cheng Lian; user@spark.apache.org Subject: Re: Spark SQL parser bug? It seems the wrong results you got were caused by the timezone. The time in java.sql.Timestamp(long time) means milliseconds since January 1, 1970, 00:00:00 GMT. A negative number is the number of milliseconds

Re: Spark SQL parser bug?

2014-10-13 Thread Yin Huai
; Cheng Lian; user@spark.apache.org *Subject:* Re: Spark SQL parser bug? It seems the wrong results you got were caused by the timezone. The time in java.sql.Timestamp(long time) means milliseconds since January 1, 1970, 00:00:00 *GMT*. A negative number is the number of milliseconds
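The epoch arithmetic described above can be sketched in plain Java, independent of Spark (the class and variable names here are mine, not from the thread):

```java
import java.sql.Timestamp;
import java.text.SimpleDateFormat;
import java.util.TimeZone;

public class EpochDemo {
    public static void main(String[] args) {
        // 0L is exactly the epoch: January 1, 1970, 00:00:00 GMT
        Timestamp epoch = new Timestamp(0L);
        // A negative argument is an instant *before* the epoch:
        // -1000L is one second earlier
        Timestamp beforeEpoch = new Timestamp(-1000L);

        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        // Format in GMT; the JVM's default zone would shift the printed value
        fmt.setTimeZone(TimeZone.getTimeZone("GMT"));
        System.out.println(fmt.format(epoch));       // 1970-01-01 00:00:00
        System.out.println(fmt.format(beforeEpoch)); // 1969-12-31 23:59:59
    }
}
```

This is also why a Timestamp printed in the shell can look "wrong": toString renders it in the JVM's local timezone, not GMT.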

RE: Spark SQL parser bug?

2014-10-13 Thread Mohammed Guller
That explains it. Thanks! Mohammed From: Yin Huai [mailto:huaiyin@gmail.com] Sent: Monday, October 13, 2014 8:47 AM To: Mohammed Guller Cc: Cheng, Hao; Cheng Lian; user@spark.apache.org Subject: Re: Spark SQL parser bug? Yeah, it is not related to timezone. I think you hit this issue: https

RE: Spark SQL parser bug?

2014-10-12 Thread Cheng, Hao
...@glassbeam.com] Sent: Sunday, October 12, 2014 12:06 AM To: Cheng Lian; user@spark.apache.org Subject: RE: Spark SQL parser bug? I tried even without the "T" and it still returns an empty result: scala> val sRdd = sqlContext.sql("select a from x where ts = '2012-01-01 00:00:00';") sRdd

RE: Spark SQL parser bug?

2014-10-11 Thread Mohammed Guller
: Re: Spark SQL parser bug? Hmm, there is a “T” in the timestamp string, which makes the string not a valid timestamp string representation. Internally Spark SQL uses java.sql.Timestamp.valueOf to cast a string to a timestamp. On 10/11/14 2:08 AM, Mohammed Guller wrote: scala

Re: Spark SQL parser bug?

2014-10-10 Thread Cheng Lian
Hi Mohammed, Would you mind sharing the DDL of the table "x" and the complete stacktrace of the exception you got? A full Spark shell session history would be more than helpful. PR #2084 was merged into master in August, and timestamp type is supported in 1.1. I tried the following

RE: Spark SQL parser bug?

2014-10-10 Thread Mohammed Guller
, 2014 4:37 AM To: Mohammed Guller; user@spark.apache.org Subject: Re: Spark SQL parser bug? Hi Mohammed, Would you mind sharing the DDL of the table x and the complete stacktrace of the exception you got? A full Spark shell session history would be more than helpful. PR #2084 had been merged

Re: Spark SQL parser bug?

2014-10-10 Thread Cheng Lian
Hmm, there is a "T" in the timestamp string, which makes the string not a valid timestamp string representation. Internally Spark SQL uses java.sql.Timestamp.valueOf to cast a string to a timestamp. On 10/11/14 2:08 AM, Mohammed Guller wrote: scala> rdd.registerTempTable("x") scala> val sRdd
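Cheng Lian's explanation is easy to verify outside Spark: java.sql.Timestamp.valueOf accepts the JDBC escape format "yyyy-[m]m-[d]d hh:mm:ss[.f...]" and throws on an ISO-8601 "T" separator (a minimal sketch; the class name is mine, not from the thread):

```java
import java.sql.Timestamp;

public class ValueOfDemo {
    public static void main(String[] args) {
        // JDBC escape format: a space between the date and time parts
        Timestamp ok = Timestamp.valueOf("2012-01-01 00:00:00");
        System.out.println(ok); // 2012-01-01 00:00:00.0

        // The ISO-8601 "T" separator is rejected
        try {
            Timestamp.valueOf("2012-01-01T00:00:00");
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

So a predicate comparing ts against a string containing the "T" fails the cast and cannot match any row, consistent with the empty results reported earlier in the thread.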