Actually, I tried this in the spark-shell and got the same error; then for some
reason I tried back-ticking "timestamp" and it worked:

val result = sqlContext.sql("select toSeconds(`timestamp`) as t,
count(rid) as qps from blah group by toSeconds(`timestamp`), qi.clientName")

So it seems SQLContext does support UDFs after all.
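For reference, the unit conversion behind the UDF is just integer division, so it can be checked without Spark at all. A minimal, Spark-free sketch (the sample timestamp value is made up for illustration):

```scala
// Convert an epoch timestamp in microseconds to whole seconds.
// Integer division truncates the sub-second part, which is exactly
// what we want for second-level grouping buckets (assumes the
// timestamps are non-negative epoch values).
def toSeconds(timestampMicros: Long): Long = timestampMicros / 1000000L

// Hypothetical example value: an epoch timestamp in microseconds.
val micros = 1423607520123456L
println(toSeconds(micros))
```

Registering this same function with `sqlContext.registerFunction("toSeconds", toSeconds _)` then makes it usable in SQL, with the `timestamp` column back-ticked as above.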



On Tue, Feb 10, 2015 at 2:32 PM, Michael Armbrust <mich...@databricks.com>
wrote:

> The simple SQL parser doesn't yet support UDFs.  Try using a HiveContext.
>
> On Tue, Feb 10, 2015 at 1:44 PM, Mohnish Kodnani <
> mohnish.kodn...@gmail.com> wrote:
>
>> Hi,
>> I am trying a very simple registerFunction, and it is giving me errors.
>>
>> I have a parquet file which I register as temp table.
>> Then I define a UDF.
>>
>> def toSeconds(timestamp: Long): Long = timestamp/1000000
>>
>> sqlContext.registerFunction("toSeconds", toSeconds _)
>>
>> val result = sqlContext.sql("select toSeconds(timestamp) from blah");
>> I get the following error.
>> java.lang.RuntimeException: [1.18] failure: ``)'' expected but
>> `timestamp' found
>>
>> select toSeconds(timestamp) from blah
>>
>> My end goal is as follows:
>> We have a log file with timestamps in microseconds, and I would like to
>> group entries with second-level precision, so eventually I want to run
>> the query:
>> select toSeconds(timestamp) as t, count(x) from table group by t,x
>>
>>
