update?
Thanks,
Leon
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-parser-bug-tp15999p19793.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
From: Cheng, Hao [mailto:hao.ch...@intel.com]
Sent: Sunday, October 12, 2014 1:35 AM
To: Mohammed Guller; Cheng Lian; user@spark.apache.org
Subject: RE: Spark SQL parser bug?
Hi, I couldn’t reproduce the bug with the latest master branch. Which version
are you using? Can you also list data in the table “x”?
case class T(a:String, ts:java.sql.Timestamp)
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
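For reference, a fuller spark-shell reproduction along these lines might look like the sketch below (Spark 1.1-era API with createSchemaRDD; the sample rows and timestamp values are invented for illustration, not taken from the original table "x"):

import java.sql.Timestamp

case class T(a: String, ts: Timestamp)

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.createSchemaRDD  // implicit RDD[T] -> SchemaRDD conversion in 1.1

// Two invented sample rows, one inside and one outside the queried range
val rdd = sc.parallelize(Seq(
  T("in-range", Timestamp.valueOf("2012-02-15 10:00:00")),
  T("out-of-range", Timestamp.valueOf("2014-06-01 10:00:00"))))
rdd.registerTempTable("x")

sqlContext.sql(
  "SELECT a FROM x WHERE ts >= '2012-01-01 00:00:00' AND ts <= '2012-03-31 23:59:59'"
).collect()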
To: Cheng, Hao; Cheng Lian; user@spark.apache.org
Subject: Re: Spark SQL parser bug?
Seems the reason that you got wrong results was caused by timezone.
The time in java.sql.Timestamp(long time) means milliseconds since January 1,
1970, 00:00:00 GMT. A negative number is the number of milliseconds before
January 1, 1970, 00:00:00 GMT.
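To illustrate the point (this snippet is not from the thread; the commented values assume the JVM default time zone is GMT for the first call and US Pacific for the second):

import java.sql.Timestamp
import java.util.TimeZone

// getTime is an absolute instant (millis since 1970-01-01 00:00:00 GMT),
// while Timestamp.valueOf interprets the string in the JVM's default zone.
TimeZone.setDefault(TimeZone.getTimeZone("GMT"))
Timestamp.valueOf("2012-01-01 00:00:00").getTime
// -> 1325376000000

TimeZone.setDefault(TimeZone.getTimeZone("America/Los_Angeles"))
Timestamp.valueOf("2012-01-01 00:00:00").getTime
// -> 1325404800000 (same wall-clock string, 8 hours later as an instant)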
That explains it. Thanks!
Mohammed
From: Yin Huai [mailto:huaiyin@gmail.com]
Sent: Monday, October 13, 2014 8:47 AM
To: Mohammed Guller
Cc: Cheng, Hao; Cheng Lian; user@spark.apache.org
Subject: Re: Spark SQL parser bug?
Yeah, it is not related to timezone. I think you hit this issue: https
From: Mohammed Guller [mailto:...@glassbeam.com]
Sent: Sunday, October 12, 2014 12:06 AM
To: Cheng Lian; user@spark.apache.org
Subject: RE: Spark SQL parser bug?
I tried even without the “T” and it still returns an empty result:
scala> val sRdd = sqlContext.sql("select a from x where ts >= '2012-01-01
00:00:00'")
sRdd
Subject: Re: Spark SQL parser bug?
Hmm, there is a “T” in the timestamp string, which makes the string not a valid
timestamp string representation. Internally Spark SQL uses
java.sql.Timestamp.valueOf to cast a string to a timestamp.
On 10/11/14 2:08 AM, Mohammed Guller wrote:
scala> rdd.registerTempTable("x")
scala> val sRdd
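As a quick illustration of that casting behavior (plain Scala, not from the thread):

import java.sql.Timestamp

Timestamp.valueOf("2012-01-01 00:00:00")
// -> 2012-01-01 00:00:00.0

Timestamp.valueOf("2012-01-01T00:00:00")
// -> throws IllegalArgumentException: the "T" does not match the expected
//    yyyy-mm-dd hh:mm:ss[.fffffffff] format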
To: Mohammed Guller; user@spark.apache.org
Subject: Re: Spark SQL parser bug?
Hi Mohammed,
Would you mind sharing the DDL of the table x and the complete stacktrace of
the exception you got? A full Spark shell session history would be more than
helpful. PR #2084 had been merged in master in Aug, and timestamp type is
supported in 1.1.
I tried the following
Hi -
When I run the following Spark SQL query in Spark-shell (version 1.1.0):
val rdd = sqlContext.sql("SELECT a FROM x WHERE ts >= '2012-01-01T00:00:00' AND
ts <= '2012-03-31T23:59:59'")
it gives the following error:
rdd: org.apache.spark.sql.SchemaRDD =
SchemaRDD[294] at RDD at
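For reference, the same range query with the “T” replaced by a space, so the literals are in the form java.sql.Timestamp.valueOf accepts (though, as noted earlier in the thread, the query still returned an empty result even without the “T”):

val rdd = sqlContext.sql(
  "SELECT a FROM x WHERE ts >= '2012-01-01 00:00:00' AND ts <= '2012-03-31 23:59:59'")
rdd.collect()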