[ https://issues.apache.org/jira/browse/SPARK-27537?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon updated SPARK-27537:
---------------------------------
Description:
{code}
[ERROR]: [Error] $SPARK_HOME/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.866: Value size is not a member of Object
[ERROR]: [Error] $SPARK_HOME/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.869: Value size is not a member of Object
{code}
ERROR: two errors found
Below is the related code:
{code}
test("toString") {
  val empty = Matrices.ones(0, 0)
  empty.toString(0, 0)

  val mat = Matrices.rand(5, 10, new Random())
  mat.toString(-1, -5)
  mat.toString(0, 0)
  mat.toString(Int.MinValue, Int.MinValue)
  mat.toString(Int.MaxValue, Int.MaxValue)
  var lines = mat.toString(6, 50).lines.toArray
  assert(lines.size == 5 && lines.forall(_.size <= 50))

  lines = mat.toString(5, 100).lines.toArray
  assert(lines.size == 5 && lines.forall(_.size <= 100))
}
{code}
{code}
test("numNonzeros and numActives") {
  val dm1 = Matrices.dense(3, 2, Array(0, 0, -1, 1, 0, 1))
  assert(dm1.numNonzeros === 3)
  assert(dm1.numActives === 6)

  val sm1 = Matrices.sparse(3, 2, Array(0, 2, 3), Array(0, 2, 1), Array(0.0, -1.2, 0.0))
  assert(sm1.numNonzeros === 1)
  assert(sm1.numActives === 3)
}
{code}
What shall I do to solve this problem, and when will Spark support JDK 11?
> spark-2.4.1/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.866: Value size is not a member of Object
> -----------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-27537
>                 URL: https://issues.apache.org/jira/browse/SPARK-27537
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib
>    Affects Versions: 2.3.0, 2.4.1
>         Environment: Machine: aarch64
> OS: Red Hat Enterprise Linux Server release 7.4
> Kernel: 4.11.0-44.el7a
> Spark version: spark-2.4.1
> Java: openjdk version "11.0.2" 2019-01-15
> OpenJDK Runtime Environment AdoptOpenJDK (build 11.0.2+9)
> Scala: 2.11.12
> GCC version: 4.8.5
>            Reporter: dingwei2019
>            Priority: Major
>              Labels: build, test
>
> {code}
> [ERROR]: [Error] $SPARK_HOME/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.866: Value size is not a member of Object
> [ERROR]: [Error] $SPARK_HOME/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.869: Value size is not a member of Object
> {code}
> ERROR: two errors found
> Below is the related code:
> {code}
> test("toString") {
>   val empty = Matrices.ones(0, 0)
>   empty.toString(0, 0)
>
>   val mat = Matrices.rand(5, 10, new Random())
>   mat.toString(-1, -5)
>   mat.toString(0, 0)
>   mat.toString(Int.MinValue, Int.MinValue)
>   mat.toString(Int.MaxValue, Int.MaxValue)
>   var lines = mat.toString(6, 50).lines.toArray
>   assert(lines.size == 5 && lines.forall(_.size <= 50))
>
>   lines = mat.toString(5, 100).lines.toArray
>   assert(lines.size == 5 && lines.forall(_.size <= 100))
> }
> {code}
> {code}
> test("numNonzeros and numActives") {
>   val dm1 = Matrices.dense(3, 2, Array(0, 0, -1, 1, 0, 1))
>   assert(dm1.numNonzeros === 3)
>   assert(dm1.numActives === 6)
>
>   val sm1 = Matrices.sparse(3, 2, Array(0, 2, 3), Array(0, 2, 1), Array(0.0, -1.2, 0.0))
>   assert(sm1.numNonzeros === 1)
>   assert(sm1.numActives === 3)
> }
> {code}
> What shall I do to solve this problem,
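The compile error comes from a JDK 11 API clash rather than a bug in the test itself: JDK 11 added a `lines()` method to `java.lang.String` that returns `java.util.stream.Stream[String]`, and it takes precedence over Scala's implicit `StringOps.lines` (an `Iterator[String]`). `Stream.toArray` yields `Array[Object]`, and `Object` has no `size` member, hence the two errors. A minimal sketch of a source-compatible workaround is below; the `rendered` string is a hypothetical stand-in for the output of `mat.toString(6, 50)`:

```scala
// Sketch of the JDK 11 `lines` clash and a workaround (assumes Scala 2.11/2.12).
object LinesWorkaround {
  def main(args: Array[String]): Unit = {
    // Hypothetical stand-in for mat.toString(6, 50): a multi-line rendering.
    val rendered = "1.0  2.0\n3.0  4.0\n5.0  6.0"

    // On JDK 11, `rendered.lines` resolves to String.lines(): Stream[String],
    // whose toArray returns Array[Object] -- so `_.size` fails to compile.
    // Splitting on '\n' explicitly always yields Array[String], which
    // compiles identically on JDK 8 and JDK 11.
    val lines: Array[String] = rendered.split('\n')
    assert(lines.size == 3 && lines.forall(_.size <= 50))
    println(lines.length)
  }
}
```

For the same reason, Scala 2.12.7+ deprecates `StringOps.lines` in favor of `linesIterator`. As for the second question: official JDK 11 support landed in Spark 3.0 (tracked under SPARK-24417); the 2.x line targets JDK 8.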
> and when will Spark support JDK 11?

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)