[jira] [Updated] (SPARK-27537) spark-2.4.1/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.866:Value size is not a member of Object

2019-04-25 Thread Hyukjin Kwon (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-27537?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hyukjin Kwon updated SPARK-27537:
-
Description: 
{code}
[ERROR]: [Error] $SPARK_HOME/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.866:Value size is not a member of Object
[ERROR]: [Error] $SPARK_HOME/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.869:Value size is not a member of Object
{code}

ERROR: two errors found

Below is the related code:

{code}
test("toString") {
  val empty = Matrices.ones(0, 0)
  empty.toString(0, 0)
 
  val mat = Matrices.rand(5, 10, new Random())
  mat.toString(-1, -5)
  mat.toString(0, 0)
  mat.toString(Int.MinValue, Int.MinValue)
  mat.toString(Int.MaxValue, Int.MaxValue)
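  // Note: the two `assert` calls below are the ones reported at
  // MatricesSuite.scala lines 866 and 869. Under JDK 11, `.lines` resolves to
  // java.lang.String#lines() (returning a java.util.stream.Stream) rather
  // than Scala's StringLike.lines, so `.toArray` yields Array[Object] and
  // `_.size` no longer compiles.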
  var lines = mat.toString(6, 50).lines.toArray
  assert(lines.size == 5 && lines.forall(_.size <= 50))
 
  lines = mat.toString(5, 100).lines.toArray
  assert(lines.size == 5 && lines.forall(_.size <= 100))
}
{code}

{code}
test("numNonzeros and numActives") {
  val dm1 = Matrices.dense(3, 2, Array(0, 0, -1, 1, 0, 1))
  assert(dm1.numNonzeros === 3)
  assert(dm1.numActives === 6)

  val sm1 = Matrices.sparse(3, 2, Array(0, 2, 3), Array(0, 2, 1), Array(0.0, -1.2, 0.0))
  assert(sm1.numNonzeros === 1)
  assert(sm1.numActives === 3)
}
{code}


What shall I do to solve this problem, and when will Spark support JDK 11?
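
For illustration only (a sketch, not the actual upstream fix): because JDK 11's new String.lines() method takes precedence over Scala 2.11's StringLike.lines, one hypothetical workaround is to avoid .lines in the two failing assertions and split on the row separator directly, assuming Matrices.toString separates rows with "\n":

{code}
// Hypothetical rewrite of the failing assertions inside the same test.
// String.split("\n") returns Array[String] on both JDK 8 and JDK 11, so the
// size/forall calls keep compiling. Assumes "\n" is the row separator used
// by Matrices.toString.
var lines = mat.toString(6, 50).split("\n")
assert(lines.size == 5 && lines.forall(_.size <= 50))

lines = mat.toString(5, 100).split("\n")
assert(lines.size == 5 && lines.forall(_.size <= 100))
{code}

Building with JDK 8 also avoids the error, as the analysis later in this thread explains.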


  was:
[ERROR]: [Error] $SPARK_HOME/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.866:Value size is not a member of Object

[ERROR]: [Error] $SPARK_HOME/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.869:Value size is not a member of Object

ERROR: two errors found

Below is the related code:
856   test("toString") {
857 val empty = Matrices.ones(0, 0)
858 empty.toString(0, 0)
859 
860 val mat = Matrices.rand(5, 10, new Random())
861 mat.toString(-1, -5)
862 mat.toString(0, 0)
863 mat.toString(Int.MinValue, Int.MinValue)
864 mat.toString(Int.MaxValue, Int.MaxValue)
865 var lines = mat.toString(6, 50).lines.toArray
866 assert(lines.size == 5 && lines.forall(_.size <= 50))
867 
868 lines = mat.toString(5, 100).lines.toArray
869 assert(lines.size == 5 && lines.forall(_.size <= 100))
870   }
871 
872   test("numNonzeros and numActives") {
873 val dm1 = Matrices.dense(3, 2, Array(0, 0, -1, 1, 0, 1))
874 assert(dm1.numNonzeros === 3)
875 assert(dm1.numActives === 6)
876 
877 val sm1 = Matrices.sparse(3, 2, Array(0, 2, 3), Array(0, 2, 1), Array(0.0, -1.2, 0.0))
878 assert(sm1.numNonzeros === 1)
879 assert(sm1.numActives === 3)


What shall I do to solve this problem, and when will Spark support JDK 11?



> spark-2.4.1/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.866:Value size is not a member of Object
> -
>
> Key: SPARK-27537
> URL: https://issues.apache.org/jira/browse/SPARK-27537
> Project: Spark
>  Issue Type: Bug
>  Components: MLlib
>Affects Versions: 2.3.0, 2.4.1
> Environment: Machine:aarch64
> OS:Red Hat Enterprise Linux Server release 7.4
> Kernel:4.11.0-44.el7a
> spark version: spark-2.4.1
> java:openjdk version "11.0.2" 2019-01-15
>           OpenJDK Runtime Environment AdoptOpenJDK (build 11.0.2+9)
> scala:2.11.12
> gcc version:4.8.5
>Reporter: dingwei2019
>Priority: Major
>  Labels: build, test
>
> {code}
> [ERROR]: [Error] $SPARK_HOME/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.866:Value size is not a member of Object
> [ERROR]: [Error] $SPARK_HOME/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.869:Value size is not a member of Object
> {code}
> ERROR: two errors found
> Below is the related code:
> {code}
> test("toString") {
>   val empty = Matrices.ones(0, 0)
>   empty.toString(0, 0)
>  
>   val mat = Matrices.rand(5, 10, new Random())
>   mat.toString(-1, -5)
>   mat.toString(0, 0)
>   mat.toString(Int.MinValue, Int.MinValue)
>   mat.toString(Int.MaxValue, Int.MaxValue)
>   var lines = mat.toString(6, 50).lines.toArray
>   assert(lines.size == 5 && lines.forall(_.size <= 50))
>  
>   lines = mat.toString(5, 100).lines.toArray
>   assert(lines.size == 5 && lines.forall(_.size <= 100))
> }
> {code}
> {code}
> test("numNonzeros and numActives") {
>   val dm1 = Matrices.dense(3, 2, Array(0, 0, -1, 1, 0, 1))
>   assert(dm1.numNonzeros === 3)
>   assert(dm1.numActives === 6)
>   val sm1 = Matrices.sparse(3, 2, Array(0, 2, 3), Array(0, 2, 1), Array(0.0, -1.2, 0.0))
>   assert(sm1.numNonzeros === 1)
>   assert(sm1.numActives === 3)
> }
> {code}
> What shall I do to solve this problem, and when will Spark support JDK 11?

[jira] [Updated] (SPARK-27537) spark-2.4.1/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.866:Value size is not a member of Object

2019-04-21 Thread dingwei2019 (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-27537?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

dingwei2019 updated SPARK-27537:

Description: 
[ERROR]: [Error] $SPARK_HOME/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.866:Value size is not a member of Object

[ERROR]: [Error] $SPARK_HOME/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.869:Value size is not a member of Object

ERROR: two errors found

Below is the related code:
856   test("toString") {
857 val empty = Matrices.ones(0, 0)
858 empty.toString(0, 0)
859 
860 val mat = Matrices.rand(5, 10, new Random())
861 mat.toString(-1, -5)
862 mat.toString(0, 0)
863 mat.toString(Int.MinValue, Int.MinValue)
864 mat.toString(Int.MaxValue, Int.MaxValue)
865 var lines = mat.toString(6, 50).lines.toArray
866 assert(lines.size == 5 && lines.forall(_.size <= 50))
867 
868 lines = mat.toString(5, 100).lines.toArray
869 assert(lines.size == 5 && lines.forall(_.size <= 100))
870   }
871 
872   test("numNonzeros and numActives") {
873 val dm1 = Matrices.dense(3, 2, Array(0, 0, -1, 1, 0, 1))
874 assert(dm1.numNonzeros === 3)
875 assert(dm1.numActives === 6)
876 
877 val sm1 = Matrices.sparse(3, 2, Array(0, 2, 3), Array(0, 2, 1), Array(0.0, -1.2, 0.0))
878 assert(sm1.numNonzeros === 1)
879 assert(sm1.numActives === 3)


What shall I do to solve this problem, and when will Spark support JDK 11?


  was:
[ERROR]: [Error] $SPARK_HOME/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.866:Value size is not a member of Object

[ERROR]: [Error] $SPARK_HOME/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.869:Value size is not a member of Object

ERROR: two errors found

Below is the related code:
856   test("toString") {
857 val empty = Matrices.ones(0, 0)
858 empty.toString(0, 0)
859 
860 val mat = Matrices.rand(5, 10, new Random())
861 mat.toString(-1, -5)
862 mat.toString(0, 0)
863 mat.toString(Int.MinValue, Int.MinValue)
864 mat.toString(Int.MaxValue, Int.MaxValue)
865 var lines = mat.toString(6, 50).lines.toArray
866 assert(lines.size == 5 && lines.forall(_.size <= 50))
867 
868 lines = mat.toString(5, 100).lines.toArray
869 assert(lines.size == 5 && lines.forall(_.size <= 100))
870   }
871 
872   test("numNonzeros and numActives") {
873 val dm1 = Matrices.dense(3, 2, Array(0, 0, -1, 1, 0, 1))
874 assert(dm1.numNonzeros === 3)
875 assert(dm1.numActives === 6)
876 
877 val sm1 = Matrices.sparse(3, 2, Array(0, 2, 3), Array(0, 2, 1), Array(0.0, -1.2, 0.0))
878 assert(sm1.numNonzeros === 1)
879 assert(sm1.numActives === 3)






[jira] [Updated] (SPARK-27537) spark-2.4.1/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.866:Value size is not a member of Object

2019-04-21 Thread dingwei2019 (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-27537?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

dingwei2019 updated SPARK-27537:

Docs Text:   (was: The problem was found in the Spark ML test module. Although it is only a test module, I want to figure it out.
From the description above, it looks like an incompatibility between Java 11 and Scala 2.11.12: if I switch back to JDK 8, there is no problem.
Below is my analysis:

When a method already exists on the Java class itself, Scala resolves the call to that Java method; only otherwise does it fall back to a Scala method supplied by an implicit conversion.
The 'String' class in Java 11 adds a lines() method, and this new method conflicts with the Scala one.

Scala has a lines method on 'StringLike', which returns an Iterator[String];
Iterator in Scala has a toArray method, which returns an Array[String];
Array in Scala has a size method, so when the Scala method is used there is no problem:
lines (Iterator[String]) --> toArray (Array[String]) --> size

But Java 11 adds lines() to 'String', which returns a Stream;
Stream.toArray() in Java 11 returns Object[] (Array[Object] in Scala);
Object has no 'size' member, which is exactly what the error says:
lines (Stream) --> toArray (Array[Object]) --> no size method.)
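
A minimal standalone sketch of the resolution difference described above (illustrative only: the object name and sample string are invented, and it assumes Scala 2.11, where StringOps/StringLike still define lines):

{code}
// Illustrative sketch: why `"...".lines.toArray` type-checks on JDK 8 but
// not on JDK 11 with Scala 2.11.
object LinesResolutionSketch {
  def main(args: Array[String]): Unit = {
    val s = "row1\nrow2"

    // Forcing Scala's implicit conversion: StringOps.lines returns
    // Iterator[String], toArray gives Array[String], so size is available
    // on the elements.
    val scalaLines: Array[String] = Predef.augmentString(s).lines.toArray
    println(scalaLines.forall(_.size <= 50))   // compiles, prints true

    // On JDK 11, `s.lines` resolves to java.lang.String#lines(), which
    // returns java.util.stream.Stream[String]; Stream.toArray() gives
    // Array[Object], whose elements have no `size` member, producing the
    // "value size is not a member of Object" error at lines 866 and 869.
    // val javaLines = s.lines.toArray          // Array[Object] under JDK 11
    // javaLines.forall(_.size <= 50)           // does not compile
  }
}
{code}

Compiling the commented-out lines with JDK 8 works again, because java.lang.String has no lines() method there and the implicit conversion to StringOps applies.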



