Spark SQL is "runtime strongly typed", meaning it must know the actual type of a UDF's parameter and return value. A UDF whose parameter is AnyVal (whose schema resolves to Any) therefore cannot be registered, so this will not work.
On Jan 3, 2017 07:46, "Linyuxin" <linyu...@huawei.com> wrote:
> Hi all
>
> *With Spark 1.5.1*
>
> *When I want to implement an Oracle decode function (like
> decode(col1,1,'xxx','p2','yyy',0))*
>
> *And the code may look like this:*
>
> sqlContext.udf.register("any_test", (s: AnyVal) => {
>   if (s == null)
>     null
>   else
>     s
> })
>
> *The error shows:*
>
> Exception in thread "main" java.lang.UnsupportedOperationException:
> Schema for type Any is not supported
>     at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:153)
>     at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:29)
>     at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:64)
>     at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:29)
>     at org.apache.spark.sql.UDFRegistration.register(UDFRegistration.scala:145)
>     …
>
> *any suggestion?*
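As a workaround, one way to express the decode logic is a plain Scala function with a concrete parameter type. This is only a sketch: the function name `decodeTest` and the specific case values are illustrative, matching the decode(col1,1,'xxx','p2','yyy',0) example from the question. Because Catalyst can derive a schema for `String => Option[String]` (unlike `AnyVal => Any`), a function shaped like this can be registered as a UDF, e.g. `sqlContext.udf.register("decode_test", decodeTest _)`; `None` maps to SQL NULL.

```scala
// Sketch of the decode logic with concrete types, assuming string-typed input.
// Option(s) turns a null input into None, which Spark renders as SQL NULL,
// avoiding the untyped null handling that forced the parameter to AnyVal.
def decodeTest(s: String): Option[String] =
  Option(s).map {
    case "1"  => "xxx"  // decode match: 1 -> 'xxx'
    case "p2" => "yyy"  // decode match: 'p2' -> 'yyy'
    case _    => "0"    // decode default: 0
  }
```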