[ https://issues.apache.org/jira/browse/SPARK-14221?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15477903#comment-15477903 ]
Jakob Odersky edited comment on SPARK-14221 at 9/9/16 6:30 PM:
---------------------------------------------------------------

[~joshrosen]'s upstream PR requires Kryo 3.1, a version that was in the works when the PR was created but was never published (and is hence a major blocker for Chill). Instead, Kryo went straight to version 4.0.0 (see changes [here|https://github.com/EsotericSoftware/kryo#new-in-release-400]). Would a transitive dependency on Kryo 4.0.0 be acceptable in Spark? Of course, updating the Kryo version in Chill in order to support Scala 2.12 will also need discussion upstream.

was (Author: jodersky):
[~joshrosen]'s upstream PR requires Kryo 3.1, a version that was in the works when the PR was created but was never published. Instead, Kryo went straight to version 4.0.0 (see changes [here|https://github.com/EsotericSoftware/kryo#new-in-release-400]). Would a transitive dependency on Kryo 4.0.0 be acceptable in Spark? Of course, updating the Kryo version in Chill in order to support Scala 2.12 will also need discussion upstream.

> Cross-publish Chill for Scala 2.12
> ----------------------------------
>
>                 Key: SPARK-14221
>                 URL: https://issues.apache.org/jira/browse/SPARK-14221
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Build
>            Reporter: Josh Rosen
>            Assignee: Josh Rosen
>
> We need to cross-publish Chill in order to build against Scala 2.12.
> Upstream issue: https://github.com/twitter/chill/issues/252
> I tried building and testing {{chill-scala}} against 2.12.0-M3 and ran into
> multiple failed tests due to issues with Java 8 lambda serialization (similar
> to https://github.com/EsotericSoftware/kryo/issues/215), so this task will be
> slightly more involved than just bumping the dependencies in the Chill build.
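For reference, the dependency bump discussed above would amount to changing Kryo's coordinates in Chill's sbt build. The fragment below is a hedged sketch only, not a tested configuration: Kryo 4.0.0 is published under the `com.esotericsoftware` group ID, but whether Chill should depend on the `kryo` or the `kryo-shaded` artifact (and where in its build the version is declared) would need to be settled upstream.

```scala
// build.sbt fragment (sketch) -- illustrative only.
// Kryo 4.0.0 coordinates on Maven Central: com.esotericsoftware:kryo:4.0.0.
// Chill may instead want the shaded artifact (kryo-shaded) to avoid
// leaking Kryo's own dependencies onto downstream classpaths.
libraryDependencies += "com.esotericsoftware" % "kryo" % "4.0.0"
```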
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org