Hi Yin
It works well with HiveContext.
Thanks!!!
Regards.
Miguel Angel.
On Fri, Mar 13, 2015 at 3:18 PM, Yin Huai yh...@databricks.com wrote:
Hi.
I have a query in Spark SQL and I cannot convert a value to BIGINT:
CAST(column AS BIGINT) or
CAST(0 AS BIGINT)
The output is:
Exception in thread main java.lang.RuntimeException: [34.62] failure:
``DECIMAL'' expected but identifier BIGINT found
Thanks!!
Regards.
Miguel Ángel
Are you using SQLContext? Right now, the parser in the SQLContext is quite
limited in the data type keywords that it handles (see here
https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/SqlParser.scala#L391)
and unfortunately BIGINT is not handled.
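
For anyone hitting the same error, a minimal sketch of the workaround discussed above (Spark 1.x API; the app name and column alias are illustrative, and a Spark runtime with Hive support is assumed):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object BigIntCastExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("bigint-cast-example").setMaster("local[*]"))

    // HiveContext uses the HiveQL parser, which accepts BIGINT,
    // unlike the basic SQLContext parser that raises
    // "``DECIMAL'' expected but identifier BIGINT found".
    val hiveContext = new HiveContext(sc)

    val df = hiveContext.sql("SELECT CAST(0 AS BIGINT) AS zero")
    df.show()

    sc.stop()
  }
}
```

The same CAST succeeds on any column reference, e.g. `CAST(column AS BIGINT)`, once the query goes through HiveContext instead of SQLContext.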