[ https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14371648#comment-14371648 ]
Martin Grotzke commented on SPARK-6152:
---------------------------------------

I'll try to get out a new Kryo version as soon as possible...

> Spark does not support Java 8 compiled Scala classes
> ----------------------------------------------------
>
>                 Key: SPARK-6152
>                 URL: https://issues.apache.org/jira/browse/SPARK-6152
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.1
>        Environment: Java 8+
>                     Scala 2.11
>            Reporter: Ronald Chen
>            Priority: Minor
>
> Spark uses reflectasm to inspect Scala closures, which fails if the *user-defined Scala closures* are compiled to the Java 8 class file version.
> The cause is that reflectasm does not support Java 8:
> https://github.com/EsotericSoftware/reflectasm/issues/35
> Workaround:
> Don't compile Scala classes to the Java 8 class file version; Scala 2.11 neither supports nor requires any Java 8 features.
> Stack trace:
> {code}
> java.lang.IllegalArgumentException
> 	at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
> 	at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
> 	at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
> 	at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$getClassReader(ClosureCleaner.scala:41)
> 	at org.apache.spark.util.ClosureCleaner$.getInnerClasses(ClosureCleaner.scala:84)
> 	at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:107)
> 	at org.apache.spark.SparkContext.clean(SparkContext.scala:1478)
> 	at org.apache.spark.rdd.RDD.map(RDD.scala:288)
> 	at ...my Scala 2.11 compiled to Java 8 code calling into Spark
> {code}
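For anyone trying to reproduce this: any closure passed to an RDD transformation goes through ClosureCleaner (see the stack trace: RDD.map -> SparkContext.clean -> ClosureCleaner.clean), so a trivial job is enough to trigger the failure once the calling class is compiled to the Java 8 class file version. A minimal sketch; the local master and app name are illustrative:

{code}
// Minimal reproduction sketch. When this class is compiled to Java 8
// bytecode, the map() call below fails inside ClosureCleaner before the
// job even runs, because the closure's class file cannot be parsed by
// the shaded reflectasm ASM ClassReader.
import org.apache.spark.{SparkConf, SparkContext}

object Repro {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("SPARK-6152")
    val sc = new SparkContext(conf)
    // RDD.map calls SparkContext.clean, which calls ClosureCleaner.clean.
    val result = sc.parallelize(1 to 10).map(_ + 1).collect()
    println(result.mkString(","))
    sc.stop()
  }
}
{code}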
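For context on the IllegalArgumentException itself: the shaded ASM bundled with reflectasm validates the class file's major version in the ClassReader constructor and rejects versions it does not recognize; Java 8 class files carry major version 52. A standalone sketch (a hypothetical helper, not Spark or reflectasm code) that reads that version directly from a class file:

{code}
// Hypothetical standalone helper: prints the class file major version,
// which is what the shaded ASM ClassReader rejects for Java 8 output.
import java.nio.file.{Files, Paths}

object ClassVersionCheck {
  def majorVersion(classFile: String): Int = {
    val bytes = Files.readAllBytes(Paths.get(classFile))
    // Bytes 0-3: magic 0xCAFEBABE; bytes 4-5: minor; bytes 6-7: major version.
    ((bytes(6) & 0xff) << 8) | (bytes(7) & 0xff)
  }

  def main(args: Array[String]): Unit = {
    // Java 6 = 50, Java 7 = 51, Java 8 = 52.
    println(s"class file major version: ${majorVersion(args(0))}")
  }
}
{code}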
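The workaround above translates directly into build configuration. A minimal sketch, assuming an sbt-based project; the flags shown are standard scalac/javac options, nothing Spark-specific:

{code}
// build.sbt fragment (sketch). Any 2.11.x works; the patch version is illustrative.
scalaVersion := "2.11.5"

// Have scalac emit Java 7 class files (major version 51) so Spark's shaded
// reflectasm/ASM can still parse the closure bytecode.
scalacOptions += "-target:jvm-1.7"

// If any Java sources are mixed in, keep javac on the same class file version.
javacOptions ++= Seq("-source", "1.7", "-target", "1.7")
{code}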