Hi All,

I have all my UDFs defined in classes that reside in a different package
from the one where I am instantiating my HiveContext.

I have a register function in my UDF class. I pass the HiveContext to this
function, and inside that function I call
"hiveContext.registerFunction("myudf", myudf _)"

All goes well, but at runtime, when I execute the query "val sqlresult =
hiveContext.sql(query)", it doesn't work: sqlresult comes back null. There is
no exception thrown by Spark, and there are no proper logs indicating the error.
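
For completeness, this is roughly how the calling side looks in the other
package (the app name, table, and column names are placeholders, not my real ones):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext(new SparkConf().setAppName("udf-test"))
val hiveContext = new HiveContext(sc)

// register the UDFs defined in the other package
com.example.udfs.MyUdfs.register(hiveContext)

// "mytable" and "name" stand in for my real Hive table and column
val query = "SELECT myudf(name) FROM mytable"
val sqlresult = hiveContext.sql(query)
sqlresult.collect().foreach(println)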

But everything works if I move my UDF class into the same package where I am
instantiating the hiveContext.

I dug deeper into the Spark code and found that (maybe I am wrong here)

./sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala

has a class named SimpleFunctionRegistry, whose lookupFunction does not
throw an error if it cannot find a function.


Kindly help


Appreciate your help.

Ravi.
