Github user squito commented on the issue: https://github.com/apache/spark/pull/19280

> Looks ok to me, assuming the "default serializer" in SerializerManager is configured correctly through other means.

I think that part is fine. The serializer is created here:

https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkEnv.scala#L279

The same instance is assigned to `SparkEnv.serializer`:

https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkEnv.scala#L374

which then has its default classloader set in Executor.scala, right by the part I'm changing:

https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/executor/Executor.scala#L131
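For reference, the chain described above can be sketched roughly like this. This is a simplified sketch, not the actual Spark source: `conf` and `replClassLoader` are stand-ins for the real surrounding code in `SparkEnv.create` and `Executor.scala`.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.serializer.{JavaSerializer, Serializer}

// In SparkEnv.create: the default serializer is instantiated once
// from the configuration...
val conf = new SparkConf()
val serializer: Serializer = new JavaSerializer(conf)

// ...and that same instance is what SparkEnv.serializer exposes.

// In Executor.scala the executor then points that serializer at the
// classloader that can resolve user/REPL classes:
val replClassLoader: ClassLoader = Thread.currentThread().getContextClassLoader
serializer.setDefaultClassLoader(replClassLoader)
```

Because it is the same instance everywhere, setting the default classloader in the executor is enough for anything that later deserializes through `SparkEnv.serializer`.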