[ https://issues.apache.org/jira/browse/SPARK-12414?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15075814#comment-15075814 ]
Andrew Or edited comment on SPARK-12414 at 12/31/15 7:51 AM:
-------------------------------------------------------------

It's also for code cleanup. Right now SparkEnv has a "closure serializer" and a "serializer", which is confusing. We should just use the Java serializer, since it has worked for such a long time. I don't know much about Kryo 3.0, but I'm not sure upgrading would be sufficient.

was (Author: andrewor14):
It's also for code cleanup. Right now SparkEnv has a "closure serializer" and a "serializer", which is kind of confusing. We should just use Java serializer since it's worked for such a long time. Not sure about Kryo 3.0 but I'm not sure if upgrading would be sufficient.

> Remove closure serializer
> -------------------------
>
>                 Key: SPARK-12414
>                 URL: https://issues.apache.org/jira/browse/SPARK-12414
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>    Affects Versions: 1.0.0
>            Reporter: Andrew Or
>            Assignee: Andrew Or
>
> There is a config `spark.closure.serializer` that accepts exactly one value:
> the Java serializer. This is because there are currently bugs in the Kryo
> serializer that make it not a viable candidate. This was uncovered by an
> unsuccessful attempt to make it work: SPARK-7708.
>
> My high-level point is that the Java serializer has worked well for at least
> 6 Spark versions now, and it is an incredibly complicated task to get other
> serializers (not just Kryo) to work with Spark's closures. IMO the effort is
> not worth it, and we should just remove this documentation and all the code
> associated with it.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
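For context on why the issue talks about a dedicated "closure serializer": before a task runs on an executor, the closure (including any captured local variables) has to survive a serialization round trip, and Spark does this with plain Java serialization. The sketch below is not Spark code; it is a minimal standalone illustration of that round trip, using a Java lambda made serializable via an intersection cast. The class name `ClosureRoundTrip` and the captured variable are made up for the example.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.function.Function;

public class ClosureRoundTrip {
    public static void main(String[] args) throws Exception {
        int captured = 10; // state captured by the closure, must be serializable too

        // The intersection cast marks the lambda as Serializable, analogous
        // to how Spark requires task closures to be Java-serializable.
        Function<Integer, Integer> addCaptured =
            (Function<Integer, Integer> & Serializable) x -> x + captured;

        // Serialize the closure to bytes (what Spark's JavaSerializer
        // effectively does before shipping a task to an executor).
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(addCaptured);
        }

        // Deserialize and invoke it, as an executor would.
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            @SuppressWarnings("unchecked")
            Function<Integer, Integer> revived =
                (Function<Integer, Integer>) ois.readObject();
            System.out.println(revived.apply(5)); // prints 15
        }
    }
}
```

Getting this round trip right for arbitrary closures (outer-object references, captured non-serializable fields, synthetic classes) is exactly the part that proved hard to port to Kryo in SPARK-7708.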