i found a jira that seems related:
https://issues.apache.org/jira/browse/SPARK-25047

On Fri, Mar 29, 2019 at 4:01 PM Koert Kuipers <ko...@tresata.com> wrote:

> hi all,
> we are switching from scala 2.11 to 2.12 with a spark 2.4.1 release
> candidate and so far this has been going pretty smoothly.
>
> however we do see some new serialization errors related to Function1,
> Function2, etc.
>
> they look like this:
> ClassCastException: cannot assign instance of
> java.lang.invoke.SerializedLambda to field MyCaseClass.f of type
> scala.Function1 in instance of MyCaseClass
> at
> java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2133)
>
> these are often simple case classes with a val for the function inside,
> like this:
> case class MyCaseClass[X, Y](...) {
>   val f: Function1[X, Y] = ...
> }
>
> we had no problems with these in scala 2.11. it does not look like these
> classes have members that are not serializable, and neither do the
> functions close over anything troublesome. since we get this for some
> classes but not for others i am not entirely sure what to make of it. we
> can work around the issue by changing the val f to a def, like this:
> case class MyCaseClass[X, Y](...) {
>   def f: Function1[X, Y] = ...
> }
>
> any idea what is causing this?
> thanks!
> koert
>
>
>
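the pattern from the mail can be sketched as a minimal, self-contained reproduction. note this is a hypothetical sketch (names like MyCaseClass and roundTrip are illustrative, not from any real codebase): a plain-JVM serialization round trip will typically succeed, because the ClassCastException described above depends on how Spark's JavaSerializer resolves classes across executor classloaders (see SPARK-25047), not on serialization per se. the sketch just shows the val-vs-def distinction: a val stores a lambda instance as a field (serialized via java.lang.invoke.SerializedLambda in scala 2.12), while a def recreates the function on each access, so no lambda field is written out.

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

// val: the lambda is stored as a field and travels through
// java.lang.invoke.SerializedLambda when the instance is serialized.
case class MyCaseClass[X, Y](g: X => Y) {
  val f: X => Y = x => g(x)
}

// workaround from the mail: a def is a method, not a field, so no
// lambda instance is serialized as part of the case class.
case class MyCaseClassFixed[X, Y](g: X => Y) {
  def f: X => Y = x => g(x)
}

object Repro {
  // Serialize and deserialize within a single JVM/classloader.
  def roundTrip[A <: Serializable](a: A): A = {
    val bos = new ByteArrayOutputStream()
    new ObjectOutputStream(bos).writeObject(a)
    val ois = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray))
    ois.readObject().asInstanceOf[A]
  }

  def main(args: Array[String]): Unit = {
    // both succeed here; the reported failure needs Spark's
    // executor-side classloading to manifest.
    println(roundTrip(MyCaseClass[Int, Int](_ + 1)).f(1))
    println(roundTrip(MyCaseClassFixed[Int, Int](_ + 1)).f(1))
  }
}
```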
