RE: Spark SQL parser bug?

2014-11-25 Thread Leon
? Thanks, Leon -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-parser-bug-tp15999p19793.html Sent from the Apache Spark User List mailing list archive at Nabble.com.

RE: Spark SQL parser bug?

2014-11-25 Thread Mohammed Guller
update? Thanks, Leon

RE: Spark SQL parser bug?

2014-10-13 Thread Mohammed Guller
From: Cheng, Hao [mailto:hao.ch...@intel.com] Sent: Sunday, October 12, 2014 1:35 AM To: Mohammed Guller; Cheng Lian; user@spark.apache.org Subject: RE: Spark SQL parser bug? Hi, I couldn’t reproduce the bug with the latest master branch. Which version are you using? Can you also list data

Re: Spark SQL parser bug?

2014-10-13 Thread Yin Huai
Lian; user@spark.apache.org *Subject:* RE: Spark SQL parser bug? Hi, I couldn’t reproduce the bug with the latest master branch. Which version are you using? Can you also list data in the table “x”? case class T(a:String, ts:java.sql.Timestamp) val sqlContext = new

RE: Spark SQL parser bug?

2014-10-13 Thread Mohammed Guller
: Cheng, Hao; Cheng Lian; user@spark.apache.org Subject: Re: Spark SQL parser bug? Seems the reason that you got wrong results was caused by timezone. The time in java.sql.Timestamp(long time) means milliseconds since January 1, 1970, 00:00:00 GMT. A negative number is the number of milliseconds

Re: Spark SQL parser bug?

2014-10-13 Thread Yin Huai
; Cheng Lian; user@spark.apache.org *Subject:* Re: Spark SQL parser bug? Seems the reason that you got wrong results was caused by timezone. The time in java.sql.Timestamp(long time) means milliseconds since January 1, 1970, 00:00:00 *GMT*. A negative number is the number of milliseconds
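The timezone point above can be checked with a small JDK-only sketch (not code from the thread; the GMT default is forced here only so that the printed values are deterministic): `java.sql.Timestamp(long)` takes milliseconds since January 1, 1970, 00:00:00 GMT, a negative value is an instant before the epoch, and `toString()` renders the instant in the JVM's default timezone.

```java
import java.sql.Timestamp;
import java.util.TimeZone;

public class EpochTimestampDemo {
    public static void main(String[] args) {
        // Force GMT so toString() output is deterministic; normally
        // Timestamp.toString() renders the instant in the JVM's default zone,
        // which is how the same instant can print as different wall-clock times.
        TimeZone.setDefault(TimeZone.getTimeZone("GMT"));

        // Timestamp(long) takes milliseconds since 1970-01-01 00:00:00 GMT.
        Timestamp epoch = new Timestamp(0L);
        System.out.println(epoch);           // 1970-01-01 00:00:00.0

        // A negative argument is an instant before the epoch.
        Timestamp beforeEpoch = new Timestamp(-1000L);
        System.out.println(beforeEpoch);     // 1969-12-31 23:59:59.0
    }
}
```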

RE: Spark SQL parser bug?

2014-10-13 Thread Mohammed Guller
That explains it. Thanks! Mohammed From: Yin Huai [mailto:huaiyin@gmail.com] Sent: Monday, October 13, 2014 8:47 AM To: Mohammed Guller Cc: Cheng, Hao; Cheng Lian; user@spark.apache.org Subject: Re: Spark SQL parser bug? Yeah, it is not related to timezone. I think you hit this issue https

RE: Spark SQL parser bug?

2014-10-12 Thread Cheng, Hao
...@glassbeam.com] Sent: Sunday, October 12, 2014 12:06 AM To: Cheng Lian; user@spark.apache.org Subject: RE: Spark SQL parser bug? I tried even without the “T” and it still returns an empty result: scala> val sRdd = sqlContext.sql("select a from x where ts >= '2012-01-01 00:00:00';") sRdd

RE: Spark SQL parser bug?

2014-10-11 Thread Mohammed Guller
: Re: Spark SQL parser bug? Hmm, there is a “T” in the timestamp string, which makes the string not a valid timestamp string representation. Internally Spark SQL uses java.sql.Timestamp.valueOf to cast a string to a timestamp. On 10/11/14 2:08 AM, Mohammed Guller wrote: scala

Re: Spark SQL parser bug?

2014-10-10 Thread Cheng Lian
Hi Mohammed, Would you mind sharing the DDL of the table “x” and the complete stack trace of the exception you got? A full Spark shell session history would be more than helpful. PR #2084 had been merged into master in Aug, and the timestamp type is supported in 1.1. I tried the following

RE: Spark SQL parser bug?

2014-10-10 Thread Mohammed Guller
, 2014 4:37 AM To: Mohammed Guller; user@spark.apache.org Subject: Re: Spark SQL parser bug? Hi Mohammed, Would you mind to share the DDL of the table x and the complete stacktrace of the exception you got? A full Spark shell session history would be more than helpful. PR #2084 had been merged

Re: Spark SQL parser bug?

2014-10-10 Thread Cheng Lian
Hmm, there is a “T” in the timestamp string, which makes the string not a valid timestamp string representation. Internally Spark SQL uses java.sql.Timestamp.valueOf to cast a string to a timestamp. On 10/11/14 2:08 AM, Mohammed Guller wrote: scala> rdd.registerTempTable("x") scala> val sRdd
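Cheng's explanation can be verified with nothing but the JDK (a hedged sketch, not code from the thread): java.sql.Timestamp.valueOf accepts only the JDBC escape format yyyy-[m]m-[d]d hh:mm:ss[.f...], with a space between the date and time parts, so a string using the ISO-8601 'T' separator is rejected with an IllegalArgumentException rather than parsed.

```java
import java.sql.Timestamp;

public class ValueOfDemo {
    public static void main(String[] args) {
        // JDBC escape format: date and time separated by a space -- accepted.
        Timestamp ok = Timestamp.valueOf("2012-01-01 00:00:00");
        System.out.println("parsed: " + ok);

        // ISO-8601 'T' separator: valueOf throws IllegalArgumentException,
        // so a string like '2012-01-01T00:00:00' never becomes a timestamp.
        try {
            Timestamp.valueOf("2012-01-01T00:00:00");
            System.out.println("unexpectedly parsed");
        } catch (IllegalArgumentException e) {
            System.out.println("rejected ISO form");
        }
    }
}
```

Dropping the 'T' (or formatting timestamp literals with a space) is the form valueOf is documented to accept.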

Spark SQL parser bug?

2014-10-08 Thread Mohammed Guller
Hi - When I run the following Spark SQL query in the Spark shell (version 1.1.0): val rdd = sqlContext.sql("SELECT a FROM x WHERE ts >= '2012-01-01T00:00:00' AND ts <= '2012-03-31T23:59:59'") it gives the following error: rdd: org.apache.spark.sql.SchemaRDD = SchemaRDD[294] at RDD at