[ https://issues.apache.org/jira/browse/SPARK-24201?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16519913#comment-16519913 ]

vaquar khan edited comment on SPARK-24201 at 6/22/18 2:36 AM:
--------------------------------------------------------------

Let me modify the Spark docs; the Java version needs to be corrected in the docs 
for versions 2.0, 2.1, 2.2, and 2.3.

The current docs say:

*2.0,2.1*

"Spark runs on Java 7+, Python 2.6+/3.4+ and R 3.1+. For the Scala API, Spark 
2.0.0 uses Scala 2.11. You will need to use a compatible Scala version 
(2.11.x)."

*2.2, 2.3*

We can add a new note:

"Note that the current Spark version does not support Java 9, Java 10, or later 
versions."
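If the docs gain such a note, surrounding tooling may also want to detect the running JVM's major version. A minimal, illustrative Python sketch (not Spark code; the function name is hypothetical) of parsing `java.version` strings, whose format changed in Java 9 under JEP 223:

```python
def java_major_version(version: str) -> int:
    """Return the major Java version for a `java.version` string.

    Pre-Java 9 strings look like "1.8.0_171"; from Java 9 on (JEP 223)
    they look like "9.0.4", "10.0.1", or "11-ea".
    """
    parts = version.split(".")
    if parts[0] == "1":
        # Legacy scheme: "1.8.0_171" -> 8
        return int(parts[1])
    # New scheme: "9.0.4" -> 9; strip pre-release suffixes like "11-ea" -> 11
    return int(parts[0].split("-")[0])

assert java_major_version("1.8.0_171") == 8
assert java_major_version("9.0.4") == 9
assert java_major_version("10.0.1") == 10
```

Such a check could gate a startup warning for the unsupported Java 9/10 runtimes described above.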

 



> IllegalArgumentException originating from ClosureCleaner in Java 9+ 
> --------------------------------------------------------------------
>
>                 Key: SPARK-24201
>                 URL: https://issues.apache.org/jira/browse/SPARK-24201
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>         Environment: java version "9.0.4"
> scala version "2.11.12"
>            Reporter: Grant Henke
>            Priority: Major
>
> Apache Kudu's kudu-spark tests are failing on Java 9. 
> I assume Java 9 is supported and this is an unexpected bug given the docs say 
> "Spark runs on Java 8+" [here|https://spark.apache.org/docs/2.3.0/].
> The stacktrace seen is below:
> {code}
> java.lang.IllegalArgumentException
>         at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
>         at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
>         at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
>         at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:46)
>         at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:449)
>         at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:432)
>         at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
>         at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
>         at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
>         at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
>         at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
>         at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
>         at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
>         at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:432)
>         at org.apache.xbean.asm5.ClassReader.a(Unknown Source)
>         at org.apache.xbean.asm5.ClassReader.b(Unknown Source)
>         at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
>         at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
>         at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:262)
>         at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:261)
>         at scala.collection.immutable.List.foreach(List.scala:392)
>         at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:261)
>         at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:159)
>         at org.apache.spark.SparkContext.clean(SparkContext.scala:2292)
>         at org.apache.spark.SparkContext.runJob(SparkContext.scala:2066)
>         at org.apache.spark.SparkContext.runJob(SparkContext.scala:2092)
>         at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:939)
>         at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>         at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>         at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
>         at org.apache.spark.rdd.RDD.collect(RDD.scala:938)
>         at org.apache.kudu.spark.kudu.KuduRDDTest$$anonfun$1.apply(KuduRDDTest.scala:30)
>         at org.apache.kudu.spark.kudu.KuduRDDTest$$anonfun$1.apply(KuduRDDTest.scala:27)
>         at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
>         at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
>         at org.scalatest.Transformer.apply(Transformer.scala:22)
>         at org.scalatest.Transformer.apply(Transformer.scala:20)
>         at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
>         at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
>         at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
>         at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
>         at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
>         at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
>         at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
>         at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
>         at org.apache.kudu.spark.kudu.KuduRDDTest.org$scalatest$BeforeAndAfter$$super$runTest(KuduRDDTest.scala:25)
>         at org.scalatest.BeforeAndAfter$class.runTest(BeforeAndAfter.scala:203)
>         at org.apache.kudu.spark.kudu.KuduRDDTest.runTest(KuduRDDTest.scala:25)
>         at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
>         at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
>         at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
>         at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
>         at scala.collection.immutable.List.foreach(List.scala:392)
>         at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
>         at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
>         at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
>         at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
>         at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
>         at org.scalatest.Suite$class.run(Suite.scala:1147)
>         at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
>         at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
>         at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
>         at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
>         at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
>         at org.apache.kudu.spark.kudu.KuduRDDTest.org$scalatest$BeforeAndAfterAll$$super$run(KuduRDDTest.scala:25)
>         at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
>         at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
>         at org.apache.kudu.spark.kudu.KuduRDDTest.org$scalatest$BeforeAndAfter$$super$run(KuduRDDTest.scala:25)
>         at org.scalatest.BeforeAndAfter$class.run(BeforeAndAfter.scala:258)
>         at org.apache.kudu.spark.kudu.KuduRDDTest.run(KuduRDDTest.scala:25)
> {code}
> It looks like ClassReader's constructor throws an IllegalArgumentException if 
> the class file version is newer than Java 8's (major version 52):
> {code}
> public ClassReader(final byte[] b, final int off, final int len) {
>    this.b = b;
>    // checks the class version
>    if (readShort(off + 6) > Opcodes.V1_8) {
>       throw new IllegalArgumentException();
>    }
>    ...
> {code}
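For context: `Opcodes.V1_8` is 52, the class-file major version for Java 8; Java 9 classes carry 53 and Java 10 classes carry 54, so any Java 9+ class trips the check above. A minimal Python sketch (illustrative, not Spark or ASM code) of reading the major version that ASM inspects at offset 6 of a class file:

```python
import struct

def class_major_version(class_bytes: bytes) -> int:
    # Class-file header layout (big-endian): u4 magic, u2 minor_version,
    # u2 major_version -- the field ASM's readShort(off + 6) reads.
    magic, minor, major = struct.unpack(">IHH", class_bytes[:8])
    if magic != 0xCAFEBABE:
        raise ValueError("not a class file")
    return major

# A fake 8-byte header for a Java 9 class (major version 53):
java9_header = struct.pack(">IHH", 0xCAFEBABE, 0, 53)
assert class_major_version(java9_header) == 53
# ASM 5's check `readShort(off + 6) > Opcodes.V1_8` fires because 53 > 52.
assert class_major_version(java9_header) > 52
```

This is why the exception appears only once the tests run on a Java 9+ JDK: the generated closure classes carry a major version the bundled ASM 5 refuses to parse.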
> It looks like upgrading to org.apache.xbean.asm6 would solve the issue by 
> supporting up to Java 10:
> {code}
> if (checkClassVersion && readShort(classFileOffset + 6) > Opcodes.V10) {
>    throw new IllegalArgumentException(
>       "Unsupported class file major version " + readShort(classFileOffset + 6));
> }
> {code}
> The Apache Kudu test failures can be recreated by cloning the repo and 
> running the kudu-spark tests:
> {code}
> git clone https://github.com/apache/kudu.git
> cd kudu/java/kudu-spark
> ./gradlew test
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
