Hi

Can you try the below:

We are registering using org.apache.spark.sql.functions.udf:

def myUDF(wgts: Int, amnt: Float): Float = (wgts * amnt) / 100

val myUdf = udf(myUDF(_: Int, _: Float))
Now you can invoke the function directly in Spark SQL or outside of it.
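
For example, a minimal sketch pulling the pieces together (assuming a DataFrame df1 with the RATE and AMOUNT columns from your mail; the registered name "calWghts" and the table name "t1" are just illustrative):

import org.apache.spark.sql.functions.{udf, col}

// Plain Scala function; the explicit Float return type lets Spark infer FloatType
def myUDF(wgts: Int, amnt: Float): Float = (wgts * amnt) / 100

// Wrap it for use on DataFrame columns ("outside" SQL)
val myUdf = udf(myUDF(_: Int, _: Float))
val df2 = df1.withColumn("WEIGHTED_AMOUNT", myUdf(col("RATE"), col("AMOUNT")))

// Or register it so it can be called from Spark SQL (Spark 1.5.x API)
sqlContext.udf.register("calWghts", myUDF _)
df1.registerTempTable("t1")
val df3 = sqlContext.sql(
  "SELECT RATE, AMOUNT, calWghts(RATE, AMOUNT) AS WEIGHTED_AMOUNT FROM t1")

If AMOUNT actually arrives as DoubleType (which the java.lang.Double -> java.lang.Float ClassCastException suggests), cast the column first, e.g. col("AMOUNT").cast("float"), or declare the parameter as Double.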

Thanks,
Paras Sachdeva

On Wed, Apr 27, 2016 at 1:18 PM, Divya Gehlot <divya.htco...@gmail.com>
wrote:

> Hi,
> I am using Spark 1.5.2 and defined the below UDF:
>
> import org.apache.spark.sql.functions.udf
>> val myUdf = (wgts: Int, amnt: Float) => {
>>   (wgts * amnt) / 100.asInstanceOf[Float]
>> }
>>
>
> val df2 = df1.withColumn("WEIGHTED_AMOUNT",callUDF(udfcalWghts,
> FloatType,col("RATE"),col("AMOUNT")))
>
> In my schema, RATE is IntegerType and AMOUNT is FloatType.
>
> I am getting the below error:
>
>> org.apache.spark.SparkException: Job aborted due to stage failure: Task
>> 106 in stage 89.0 failed 4 times, most recent failure: Lost task 106.3 in
>> stage 89.0 (TID 7735, ip-xx-xx-xx-xxx.ap-southeast-1.compute.internal):
>> java.lang.ClassCastException: java.lang.Double cannot be cast to
>> java.lang.Float
>>         at scala.runtime.BoxesRunTime.unboxToFloat(BoxesRunTime.java:114)
>>         at
>> $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1.apply(<c
>
> A similar issue is logged against the JavaScript API for Apache Spark (EclairJS):
> https://github.com/EclairJS/eclairjs-nashorn/issues/3
>
> Can somebody help me with the resolution?
>
> Thanks,
> Divya
