Github user WeichenXu123 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20964#discussion_r178780101
  
    --- Diff: mllib/src/test/scala/org/apache/spark/ml/feature/MaxAbsScalerSuite.scala ---
    @@ -45,9 +44,9 @@ class MaxAbsScalerSuite extends SparkFunSuite with MLlibTestSparkContext with De
           .setOutputCol("scaled")
     
         val model = scaler.fit(df)
    -    model.transform(df).select("expected", "scaled").collect()
    -      .foreach { case Row(vector1: Vector, vector2: Vector) =>
    -      assert(vector1.equals(vector2), s"MaxAbsScaler ut error: $vector2 should be $vector1")
    +    testTransformer[(Vector, Vector)](df, model, "expected", "scaled") {
    +      case Row(vector1: Vector, vector2: Vector) =>
    +        assert(vector1.equals(vector2), s"MaxAbsScaler error: $vector2 should be $vector1")
    --- End diff --
    
    Use `vector1 === vector2`.
    Also, the error message reads oddly. How about: `s"scaled value $vector2 does not equal expected value $vector1"`?
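
    A minimal sketch of the suggested assertion message in plain Scala (not the Spark test harness; `check` is a hypothetical helper, and `Vector[Double]` stands in for `org.apache.spark.ml.linalg.Vector`):

    ```scala
    object MaxAbsAssertSketch {
      // Returns None when the scaled vector matches the expected one,
      // otherwise the suggested, clearer failure message.
      def check(expected: Vector[Double], scaled: Vector[Double]): Option[String] =
        if (scaled == expected) None
        else Some(s"scaled value $scaled does not equal expected value $expected")

      def main(args: Array[String]): Unit = {
        val expected = Vector(0.25, 0.5, 1.0)
        // Matching vectors produce no message.
        assert(check(expected, expected).isEmpty)
        // A mismatch yields the suggested message, naming both values.
        check(expected, Vector(0.5, 0.5, 1.0)).foreach(println)
      }
    }
    ```

    Note: in Spark's ML test suites, `===` on vectors typically comes from the `TestingUtils` implicits rather than plain object equality.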


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
