Hi,

I'd like to use a UDF in PySpark 2.0, along these lines:

    def squareIt(x):
        return x * x

    # register the function and define its return type, then:
    spark.sql("select myUdf(adgroupid, 'extra_string_parameter') as function_result from df")

How can I register the function? I only see registerFunction on the deprecated SQLContext at http://spark.apache.org/docs/2.0.0/api/python/pyspark.sql.html. Since the 'spark' session object now unifies HiveContext and SQLContext, what is the new way to do this?

Ben