[ https://issues.apache.org/jira/browse/SPARK-43227?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yang Jie updated SPARK-43227:
-----------------------------
    Epic Link: SPARK-43745  (was: SPARK-42554)

> Fix deserialisation issue when UDFs contain a lambda expression
> ---------------------------------------------------------------
>
>                 Key: SPARK-43227
>                 URL: https://issues.apache.org/jira/browse/SPARK-43227
>             Project: Spark
>          Issue Type: Bug
>          Components: Connect
>    Affects Versions: 3.5.0
>            Reporter: Venkata Sai Akhil Gudesa
>            Priority: Major
>
> The following code:
> {code:scala}
> class A(x: Int) { def get = x * 20 + 5 }
> val dummyUdf = (x: Int) => new A(x).get
> val myUdf = udf(dummyUdf)
> spark.range(5).select(myUdf(col("id"))).as[Int].collect() {code}
> hits the following error:
> {noformat}
> io.grpc.StatusRuntimeException: INTERNAL: cannot assign instance of java.lang.invoke.SerializedLambda to field ammonite.$sess.cmd26$Helper.dummyUdf of type scala.Function1 in instance of ammonite.$sess.cmd26$Helper
>   io.grpc.Status.asRuntimeException(Status.java:535)
>   io.grpc.stub.ClientCalls$BlockingResponseStream.hasNext(ClientCalls.java:660)
>   org.apache.spark.sql.connect.client.SparkResult.org$apache$spark$sql$connect$client$SparkResult$$processResponses(SparkResult.scala:62)
>   org.apache.spark.sql.connect.client.SparkResult.length(SparkResult.scala:114)
>   org.apache.spark.sql.connect.client.SparkResult.toArray(SparkResult.scala:131)
>   org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2687)
>   org.apache.spark.sql.Dataset.withResult(Dataset.scala:3088)
>   org.apache.spark.sql.Dataset.collect(Dataset.scala:2686)
>   ammonite.$sess.cmd28$Helper.<init>(cmd28.sc:1)
>   ammonite.$sess.cmd28$.<init>(cmd28.sc:7)
>   ammonite.$sess.cmd28$.<clinit>(cmd28.sc){noformat}
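>
> For context, in Scala 2.12+ a serializable lambda is replaced on serialization by a java.lang.invoke.SerializedLambda (via writeReplace); on deserialization, a resolution step against the capturing class must turn it back into the original function object. The error above is the classic symptom of that resolution step not happening for the REPL-generated wrapper class on the server side, leaving a raw SerializedLambda that cannot be assigned to a scala.Function1 field. Below is a minimal, Spark-free sketch of the round trip that normally succeeds inside a single JVM (the object name is illustrative, not from this report):
> {code:scala}
> import java.io._
>
> object LambdaRoundTrip {
>   def main(args: Array[String]): Unit = {
>     // A serializable Scala lambda: the compiler emits a $deserializeLambda$
>     // bootstrap in the capturing class so deserialization can rebuild it.
>     val f: Int => Int = x => x * 20 + 5
>
>     // Serialization substitutes a java.lang.invoke.SerializedLambda for f.
>     val bos = new ByteArrayOutputStream()
>     val oos = new ObjectOutputStream(bos)
>     oos.writeObject(f)
>     oos.close()
>
>     // In the same JVM the SerializedLambda resolves back to a Function1.
>     // Across the Connect client/server boundary the capturing class is not
>     // available in the same way, which is where the assignment above fails.
>     val in = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray))
>     val g = in.readObject().asInstanceOf[Int => Int]
>     println(g(1)) // prints 25
>   }
> }
> {code}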


