[ https://issues.apache.org/jira/browse/SPARK-27537?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16822845#comment-16822845 ]
Hyukjin Kwon edited comment on SPARK-27537 at 4/25/19 6:05 AM:
---------------------------------------------------------------
The problem appears in the Spark ML test module. Although it is only a test module, I want to understand the cause. From the description above, it looks like an incompatibility between Java 11 and Scala 2.11.12: if I switch to JDK 8, the problem disappears.

Here is my analysis. It seems that when a method exists on the Java class itself, the call resolves to the Java method; only otherwise does Scala's extension method apply. Java 11 added a lines() method to java.lang.String, and it conflicts with the Scala API. Scala's StringLike class already has a lines method that returns an Iterator; Iterator has a toArray method that returns an Array; and Array has a size method. So when the Scala method is used, there is no problem:

lines (Iterator) --> toArray (Array) --> size

But Java 11's String.lines() returns a Stream; Stream's no-argument toArray() returns Object[], so each element is typed as Object, and Object has no size method. That is exactly what the error says:

lines (Stream) --> toArray (Object) --> no size method

What should I do to solve this problem?
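To make the resolution chain above concrete, here is a small stand-alone Java sketch (not Spark code; class and variable names are illustrative) of the JDK 11 String.lines() behavior being described:

```java
import java.util.Arrays;
import java.util.stream.Stream;

public class LinesDemo {
    public static void main(String[] args) {
        // JDK 11 added String.lines(), which returns a Stream<String>.
        Stream<String> lines = "a\nb\nc".lines();

        // The no-argument Stream.toArray() returns Object[], so each element
        // is statically typed as Object -- and Object has no 'size' member,
        // which is the Scala compile error quoted in this issue.
        Object[] raw = lines.toArray();
        System.out.println(raw.length); // 3

        // The typed overload recovers String[]:
        String[] typed = "a\nb\nc".lines().toArray(String[]::new);
        System.out.println(Arrays.toString(typed)); // [a, b, c]
    }
}
```

Under JDK 11 with Scala 2.11, `"...".lines` in the test resolves to this Java method instead of StringLike.lines, so `lines.forall(_.size <= 50)` fails to compile because `_` is typed Object. One workaround (an assumption on my part, not the official fix) is to avoid the name clash, e.g. `mat.toString(6, 50).split("\n")`; official JDK 11 support came later in the Spark 3.x line.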
> spark-2.4.1/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.866: Value size is not a member of Object
> -----------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-27537
>                 URL: https://issues.apache.org/jira/browse/SPARK-27537
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib
>    Affects Versions: 2.3.0, 2.4.1
>        Environment: Machine: aarch64
> OS: Red Hat Enterprise Linux Server release 7.4
> Kernel: 4.11.0-44.el7a
> Spark version: spark-2.4.1
> Java: openjdk version "11.0.2" 2019-01-15
> OpenJDK Runtime Environment AdoptOpenJDK (build 11.0.2+9)
> Scala: 2.11.12
> GCC version: 4.8.5
>            Reporter: dingwei2019
>            Priority: Major
>             Labels: build, test
>
> {code}
> [ERROR]: [Error] $SPARK_HOME/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.866: Value size is not a member of Object
> [ERROR]: [Error] $SPARK_HOME/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.869: Value size is not a member of Object
> {code}
> ERROR: two errors found
>
> Below is the related code:
>
> {code}
> test("toString") {
>   val empty = Matrices.ones(0, 0)
>   empty.toString(0, 0)
>
>   val mat = Matrices.rand(5, 10, new Random())
>   mat.toString(-1, -5)
>   mat.toString(0, 0)
>   mat.toString(Int.MinValue, Int.MinValue)
>   mat.toString(Int.MaxValue, Int.MaxValue)
>   var lines = mat.toString(6, 50).lines.toArray
>   assert(lines.size == 5 && lines.forall(_.size <= 50))
>
>   lines = mat.toString(5, 100).lines.toArray
>   assert(lines.size == 5 && lines.forall(_.size <= 100))
> }
> {code}
>
> {code}
> test("numNonzeros and numActives") {
>   val dm1 =
>     Matrices.dense(3, 2, Array(0, 0, -1, 1, 0, 1))
>   assert(dm1.numNonzeros === 3)
>   assert(dm1.numActives === 6)
>
>   val sm1 = Matrices.sparse(3, 2, Array(0, 2, 3), Array(0, 2, 1), Array(0.0, -1.2, 0.0))
>   assert(sm1.numNonzeros === 1)
>   assert(sm1.numActives === 3)
> }
> {code}
>
> What shall I do to solve this problem, and when will Spark support JDK 11?

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)