Hi, I have written Spark UDFs and I am able to use them in Spark Scala / PySpark via the org.apache.spark.sql.api.java.UDFx API.
I'd like to use them in Spark SQL through the Thrift server. I tried to register one with "CREATE FUNCTION myudf AS 'org.my.MyUdf'", but I get the following error when calling the function:

> org.apache.spark.sql.AnalysisException: No handler for UDF/UDAF/UDTF 'org.my.MyUdf';

I have read here (https://stackoverflow.com/a/56970800/3865083) that only the org.apache.hadoop.hive.ql.exec.UDF API works through Thrift. What is the right way to write such a UDF?

Thanks
-- nicolas
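For context, a Hive-style UDF of the kind the linked answer describes might look like the sketch below. This is a minimal illustration, not a tested implementation: it assumes hive-exec is on the classpath, and the package, class, and behavior (upper-casing a string) are placeholders.

```scala
// Hypothetical sketch of a Hive-style UDF using the old
// org.apache.hadoop.hive.ql.exec.UDF API (requires hive-exec on the classpath).
package org.my

import org.apache.hadoop.hive.ql.exec.UDF
import org.apache.hadoop.io.Text

class MyHiveUdf extends UDF {
  // Hive locates UDF entry points by reflection on methods named `evaluate`.
  def evaluate(input: Text): Text = {
    if (input == null) null
    else new Text(input.toString.toUpperCase)
  }
}
```

From a Thrift/beeline session it could then be registered with something like (jar path is a placeholder):

```sql
CREATE FUNCTION my_upper AS 'org.my.MyHiveUdf' USING JAR 'hdfs:///path/to/my-udfs.jar';
SELECT my_upper(name) FROM people;
```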