Re: Spark SQL UDF with Struct input parameters

2016-01-13 Thread Deenar Toraskar
…ut the correct name) it compiles. But when I execute it, it cannot cast it to the case class, because the data obviously does not contain the case class inside. How would rewriting collect as a Spark UDAF help there?
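To illustrate what that suggestion could look like: below is a sketch of a UDAF computing sum(ee * t) / sum(ee), assuming the Spark 1.5+ UserDefinedAggregateFunction API and FloatType input columns. The class name EffectiveExposureUDAF, the field names ee/t, and the registered name effectiveEE are illustrative, not taken from this thread. A UDAF receives its input as a Row, so no cast to a case class is involved.

  import org.apache.spark.sql.Row
  import org.apache.spark.sql.expressions.{MutableAggregationBuffer, UserDefinedAggregateFunction}
  import org.apache.spark.sql.types._

  // Aggregates sum(ee * t) / sum(ee) across rows; both inputs assumed FloatType.
  class EffectiveExposureUDAF extends UserDefinedAggregateFunction {
    def inputSchema: StructType =
      StructType(StructField("ee", FloatType) :: StructField("t", FloatType) :: Nil)
    def bufferSchema: StructType =
      StructType(StructField("num", DoubleType) :: StructField("den", DoubleType) :: Nil)
    def dataType: DataType = DoubleType
    def deterministic: Boolean = true
    def initialize(buffer: MutableAggregationBuffer): Unit = {
      buffer(0) = 0.0
      buffer(1) = 0.0
    }
    def update(buffer: MutableAggregationBuffer, input: Row): Unit =
      if (!input.isNullAt(0) && !input.isNullAt(1)) {
        buffer(0) = buffer.getDouble(0) + input.getFloat(0) * input.getFloat(1)
        buffer(1) = buffer.getDouble(1) + input.getFloat(0)
      }
    def merge(buffer1: MutableAggregationBuffer, buffer2: Row): Unit = {
      buffer1(0) = buffer1.getDouble(0) + buffer2.getDouble(0)
      buffer1(1) = buffer1.getDouble(1) + buffer2.getDouble(1)
    }
    def evaluate(buffer: Row): Any = buffer.getDouble(0) / buffer.getDouble(1)
  }

  sqlContext.udf.register("effectiveEE", new EffectiveExposureUDAF)

Because update sees one (ee, t) pair per row, the exposures stay as ordinary columns and are grouped in SQL rather than collected into an array first.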

Re: Spark SQL UDF with Struct input parameters

2015-12-25 Thread Deenar Toraskar
I have found that this does not work even with a plain struct as an input parameter:

  def testUDF(expectedExposures: (Float, Float)) = {
    expectedExposures._1 * expectedExposures._2 / expectedExposures._1
  }

  sqlContext.udf.register("testUDF", testUDF _)
  sqlContext.sql("select …
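One commonly cited workaround for struct arguments is to declare the UDF parameter as org.apache.spark.sql.Row: Spark hands a struct column to a Scala UDF as a Row, not as a tuple or case class, so the fields have to be read out by position or name. A minimal sketch along those lines, assuming both struct fields are FloatType (the column and table names in the comments are made up):

  import org.apache.spark.sql.Row

  // The struct argument arrives as a Row; read its two float fields by position.
  def testUDF(expectedExposures: Row): Float = {
    val ee = expectedExposures.getFloat(0)
    val t  = expectedExposures.getFloat(1)
    ee * t / ee
  }

  sqlContext.udf.register("testUDF", testUDF _)
  // e.g. sqlContext.sql("select testUDF(exposure) from exposures")
  // where 'exposure' is assumed to be a struct<ee:float,t:float> column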

Spark SQL UDF with Struct input parameters

2015-12-25 Thread Deenar Toraskar
Hi, I am trying to define a UDF that can take an array of tuples as input:

  def effectiveExpectedExposure(expectedExposures: Seq[(Float, Float)]) =
    expectedExposures.map(x => x._1 * x._2).sum / expectedExposures.map(x => x._1).sum

  sqlContext.udf.register("expectedPositiveExposure", …
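For what it is worth, the Row-based approach sketched in the replies above would extend to an array of structs, which a Scala UDF receives as a Seq[Row]. A hedged sketch, again assuming both fields of each struct are FloatType and using made-up table/column names in the comments:

  import org.apache.spark.sql.Row

  // An array-of-struct column arrives as a Seq[Row]; each element holds two float fields.
  def effectiveExpectedExposure(expectedExposures: Seq[Row]): Float =
    expectedExposures.map(r => r.getFloat(0) * r.getFloat(1)).sum /
      expectedExposures.map(r => r.getFloat(0)).sum

  sqlContext.udf.register("expectedPositiveExposure", effectiveExpectedExposure _)
  // e.g. sqlContext.sql("select expectedPositiveExposure(exposures) from trades")
  // where 'exposures' is assumed to be an array<struct<ee:float,t:float>> column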