Hi Pala,
Can you add the full stack trace of the exception? For now, can you use
create temporary function to work around the issue?
Thanks,
Yin
On Wed, Sep 30, 2015 at 11:01 AM, Pala M Muthaia <
mchett...@rocketfuelinc.com.invalid> wrote:
> +user list
>
> On Tue, Sep 29, 2015 at 3:43 PM, Pala M Muthaia wrote:
Thanks for getting back, Yin. I have copied the stack trace below. The
associated query is just this: "hc.sql("select murmurhash3('abc') from
dual")". The UDF murmurhash3 is already available in our Hive metastore.
Regarding the temporary function: can I create a temp function from the
existing Hive UDF code?
Yes. You can use CREATE TEMPORARY FUNCTION to create a function based on a
Hive UDF (
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-Create/Drop/ReloadFunction
).
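As a sketch of that workaround (assuming the UDF is packaged as a Hive UDF class; `com.example.udf.MurmurHash3` is a hypothetical class name here, so substitute the real one from your jar):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext("local", "udf-workaround")
val hc = new HiveContext(sc)

// Register the Hive UDF for this session only. The jar containing the
// class must already be on the classpath (e.g. added via --jars).
// "com.example.udf.MurmurHash3" is a placeholder class name.
hc.sql("CREATE TEMPORARY FUNCTION murmurhash3 AS 'com.example.udf.MurmurHash3'")

// The original query should now resolve the function through the
// session-local registration rather than the metastore lookup.
hc.sql("SELECT murmurhash3('abc') FROM dual").show()
```

This registers the function for the lifetime of the HiveContext session only, which sidesteps the permanent-function lookup that is failing.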
Regarding the error, I think the problem is that starting from Spark 1.4,
we have two separate
+user list
On Tue, Sep 29, 2015 at 3:43 PM, Pala M Muthaia wrote:
> Hi,
>
> I am trying to use internal UDFs that we have added as permanent functions
> to Hive, from within a Spark SQL query (using HiveContext), but I encounter
> NoSuchObjectException, i.e. the