We automatically convert types for UDFs defined in Scala, but we can't do it
in Java because the types are erased by the compiler. If you want to use a
Double, you should cast the value before calling the UDF.
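
A minimal sketch of both options, reusing the "Test" UDF and the sqlCtx / ctx
variables from the quoted messages below (illustrative only, not tested here):

result = sqlCtx.sql("SELECT Test(CAST(82.4 AS DOUBLE))").first();

// Or register the UDF against the type the SQL parser actually produces for
// a decimal literal such as 82.4 (a DecimalType, surfaced to Java as
// java.math.BigDecimal):
ctx.udf().register("Test", new UDF1<java.math.BigDecimal, String>() {
    public String call(java.math.BigDecimal x) throws Exception {
        return "testing";
    }
}, DataTypes.StringType);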

On Wed, Jan 13, 2016 at 8:10 PM, Raghu Ganti <raghuki...@gmail.com> wrote:

> So, when I try BigDecimal, it works. But shouldn't it parse the argument
> based on what the UDF defines? Am I missing something here?
>
> On Wed, Jan 13, 2016 at 4:57 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>
>> Please take a look at
>> sql/hive/src/test/java/org/apache/spark/sql/hive/aggregate/MyDoubleSum.java,
>> which shows a UserDefinedAggregateFunction that works on a DoubleType column.
>>
>> sql/hive/src/test/java/org/apache/spark/sql/hive/JavaDataFrameSuite.java
>> shows how it is registered.
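>>
>> A rough sketch of that registration from Java (the names sqlContext,
>> mydoublesum, value, and records are placeholders; the
>> register(String, UserDefinedAggregateFunction) overload is the one used):
>>
>> // MyDoubleSum extends org.apache.spark.sql.expressions.UserDefinedAggregateFunction
>> // and declares DataTypes.DoubleType in its inputSchema(), so the column is
>> // passed through without any manual cast.
>> sqlContext.udf().register("mydoublesum", new MyDoubleSum());
>> sqlContext.sql("SELECT mydoublesum(value) FROM records").show();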
>>
>> Cheers
>>
>> On Wed, Jan 13, 2016 at 11:58 AM, raghukiran <raghuki...@gmail.com>
>> wrote:
>>
>>> While registering and using SQL UDFs, I am running into the following
>>> problem:
>>>
>>> UDF registered:
>>>
>>> ctx.udf().register("Test", new UDF1<Double, String>() {
>>>     private static final long serialVersionUID = -8231917155671435931L;
>>>
>>>     public String call(Double x) throws Exception {
>>>         return "testing";
>>>     }
>>> }, DataTypes.StringType);
>>>
>>> Usage:
>>>
>>> query = "SELECT Test(82.4)";
>>> result = sqlCtx.sql(query).first();
>>> System.out.println(result.toString());
>>>
>>> Problem: a ClassCastException is thrown:
>>>
>>> Caused by: java.lang.ClassCastException: java.math.BigDecimal cannot be
>>> cast to java.lang.Double
>>>
>>> This problem occurs with Spark v1.5.2 and 1.6.0.
>>>
