I am calling the Spark Dataset API's map method from Clojure using standard
JVM interop syntax, and I am getting exceptions when the task results are
deserialized.
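
To give a sense of the shape of the call: this is a minimal sketch, not the
exact code from the gist; the namespace name and the reify-based MapFunction
here are illustrative (the gist may use a different function interface).

(ns spark-map-repro.core
  (:import (org.apache.spark.api.java.function MapFunction)
           (org.apache.spark.sql Encoders SparkSession)))

(defn -main [& _]
  (let [spark (-> (SparkSession/builder)
                  (.master "local[*]")
                  (.appName "clojure-map-repro")
                  (.getOrCreate))
        ;; Dataset<Long> holding the values 0..9
        ds    (.range spark 0 10)
        ;; MapFunction implemented via reify; MapFunction extends
        ;; java.io.Serializable, so Spark serializes the instance
        ;; and ships it to the tasks
        f     (reify MapFunction
                (call [_ x] (inc x)))]
    (-> ds
        (.map f (Encoders/LONG))
        (.show))
    (.stop spark)))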

This gist contains a tiny Clojure program that reproduces the problem, the
corresponding (working) Scala implementation, and a full stack trace plus
Spark logs from a run of the Clojure code:
https://gist.github.com/erp12/233a60574dc157aa544079959108f9db

Interestingly, the exception is not raised if the Clojure code is first
ahead-of-time (AOT) compiled to bytecode, or if it is executed from a REPL.
It seems that, depending on the execution context, the Spark deserializer
improperly deserializes a Function3 as a SerializedLambda.
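
For reference, by "compiled first" I mean AOT-compiling the namespace before
running it, along these lines (assuming the namespace name from the sketch
above, and that the *compile-path* directory exists and is on the classpath):

;; AOT-compile the namespace so the reify/fn classes are emitted as
;; .class files under *compile-path* ("classes" by default), rather
;; than being generated dynamically when the file is loaded
(compile 'spark-map-repro.core)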

Does anyone have any ideas about why this is happening? Is this a Spark bug?
