instantiate, by reflection,
> the UDF and UDFFromUnixTime and all went well.
>
> I already tested having all the dependencies' jars in one directory on all
> hosts and adding that to the spark.executor.extraClassPath and
> spark.driver.extraClassPath: no luck either.
> At
The problem occurs when returning data back to the driver
(note the "ResultTask" in the stack trace).
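For reference, a common way to make UDF classes visible to both the driver and the executors is to ship the jars at submit time with `--jars`, rather than relying only on `extraClassPath`. A minimal sketch follows; the jar paths and the `com.example.MyApp` class name are placeholders, not taken from the thread:

```shell
# Sketch: ship the UDF jar(s) with the application at submit time.
# --jars distributes the listed jars to every executor and puts them on
# the executors' classpaths; extraClassPath only works if the jars are
# already present at that path on each host.
spark-submit \
  --class com.example.MyApp \
  --jars /path/to/my-udfs.jar \
  --conf spark.driver.extraClassPath=/path/to/my-udfs.jar \
  my-app.jar
```

If the ClassNotFoundException really fires during result deserialization on the driver, the driver-side classpath setting is the one to check first.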
Has anyone had a similar issue?
Regards.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-Unable-to-use-Hive-UDF-because-of-Clas