On Thu, Feb 18, 2016 at 11:18:44PM +0000, Kabeer Ahmed wrote:
> I use Spark 1.5 with CDH5.5 distribution and I see that support is
> present for UDAF. From the link:
> https://databricks.com/blog/2015/09/16/spark-1-5-dataframe-api-highlights-datetimestring-handling-time-intervals-and-udafs.html,
Richard:
Please see SPARK-9664 ("Use sqlContext.udf to register UDAFs").
Cheers
I use Spark 1.5 with the CDH 5.5 distribution, and I see that support is
present for UDAFs. From the link:
https://databricks.com/blog/2015/09/16/spark-1-5-dataframe-api-highlights-datetimestring-handling-time-intervals-and-udafs.html
I read that this is an experimental feature. So it makes sense not
I'm working on an application using DataFrames (Scala API) in Spark 1.5.0,
and we need to define and use several custom aggregators. I'm having
trouble figuring out how to do this, however.
First, in which version of Spark did UDAF support land? Has it in fact
landed at all?
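For reference, the pattern described in the Databricks post linked above is to
extend `UserDefinedAggregateFunction` from `org.apache.spark.sql.expressions`,
which is available (as an experimental API) in Spark 1.5. A minimal sketch,
using the geometric-mean aggregator from that post as the example:

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.expressions.{MutableAggregationBuffer, UserDefinedAggregateFunction}
import org.apache.spark.sql.types._

// A custom aggregator computing the geometric mean of a Double column.
class GeometricMean extends UserDefinedAggregateFunction {
  // Schema of the input column(s) the aggregator consumes.
  def inputSchema: StructType =
    StructType(StructField("value", DoubleType) :: Nil)

  // Schema of the intermediate aggregation buffer.
  def bufferSchema: StructType = StructType(
    StructField("count", LongType) ::
    StructField("product", DoubleType) :: Nil)

  // Type of the final result.
  def dataType: DataType = DoubleType

  // Same input always yields the same output.
  def deterministic: Boolean = true

  // Initialize the buffer: zero rows seen, running product of 1.
  def initialize(buffer: MutableAggregationBuffer): Unit = {
    buffer(0) = 0L
    buffer(1) = 1.0
  }

  // Fold one input row into the buffer.
  def update(buffer: MutableAggregationBuffer, input: Row): Unit = {
    buffer(0) = buffer.getLong(0) + 1
    buffer(1) = buffer.getDouble(1) * input.getDouble(0)
  }

  // Merge two partial buffers (from different partitions).
  def merge(buffer1: MutableAggregationBuffer, buffer2: Row): Unit = {
    buffer1(0) = buffer1.getLong(0) + buffer2.getLong(0)
    buffer1(1) = buffer1.getDouble(1) * buffer2.getDouble(1)
  }

  // Produce the final result from the merged buffer.
  def evaluate(buffer: Row): Any =
    math.pow(buffer.getDouble(1), 1.0 / buffer.getLong(0))
}
```

Once defined, it can be registered for use in SQL via the mechanism SPARK-9664
added (assuming a `SQLContext` named `sqlContext` is in scope):
`sqlContext.udf.register("gm", new GeometricMean)`, after which
`SELECT gm(value) FROM t GROUP BY key` works; it can also be invoked directly
on a DataFrame, e.g. `df.groupBy("key").agg(new GeometricMean()(df("value")))`.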