Github user jkbradley commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3643#discussion_r21996774
  
    --- Diff: mllib/src/main/scala/org/apache/spark/mllib/util/MLUtils.scala ---
    @@ -264,6 +263,84 @@ object MLUtils {
         }
         Vectors.fromBreeze(vector1)
       }
    + 
    +  /**
    +   * Returns the squared distance between two Vectors.
    +   */
    +  private[util] def vectorSquaredDistance(v1: Vector, v2: Vector): Double = {
    +    var squaredDistance = 0.0
    +    (v1, v2) match { 
    +      case (v1: SparseVector, v2: SparseVector) =>
    +        val v1Values = v1.values
    +        val v1Indices = v1.indices
    +        val v2Values = v2.values
    +        val v2Indices = v2.indices
    +        val nnzv1 = v1Indices.size
    +        val nnzv2 = v2Indices.size
    +        
    +        var kv1 = 0
    +        var kv2 = 0
    +        while (kv1 < nnzv1 || kv2 < nnzv2) {
    +          var score = 0.0
    + 
    +          if (kv2 >= nnzv2 || (kv1 < nnzv1 && v1Indices(kv1) < v2Indices(kv2))) {
    +            score = v1Values(kv1)
    +            kv1 += 1
    +          } else if (kv1 >= nnzv1 || (kv2 < nnzv2 && v2Indices(kv2) < v1Indices(kv1))) {
    +            score = v2Values(kv2)
    +            kv2 += 1
    +          } else if (v1Indices(kv1) == v2Indices(kv2)) {
    --- End diff ---
    
    Apologies, I should have said there's no need to test anything here, since ```v1Indices(kv1) == v2Indices(kv2)``` will always be true once the first two conditions fail!
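    
    To make the point concrete, here is a minimal standalone sketch of what that branch could look like with the test dropped to a plain ```else```. The object and method names below are made up for illustration, and the final ```squaredDistance += score * score``` accumulation is assumed, since the quoted diff is cut off before that step:
    
    ```scala
    // Sketch only: a standalone version of the sparse-sparse merge loop with the
    // final index-equality test replaced by a plain `else`, since equality must
    // hold once the first two conditions fail. Names and the accumulation step
    // are assumptions, not the PR's exact code.
    import org.apache.spark.mllib.linalg.SparseVector
    
    object SparseDistanceSketch {
      def sparseSquaredDistance(v1: SparseVector, v2: SparseVector): Double = {
        val v1Values = v1.values
        val v1Indices = v1.indices
        val v2Values = v2.values
        val v2Indices = v2.indices
        val nnzv1 = v1Indices.length
        val nnzv2 = v2Indices.length
    
        var squaredDistance = 0.0
        var kv1 = 0
        var kv2 = 0
        while (kv1 < nnzv1 || kv2 < nnzv2) {
          var score = 0.0
          if (kv2 >= nnzv2 || (kv1 < nnzv1 && v1Indices(kv1) < v2Indices(kv2))) {
            // index present only in v1
            score = v1Values(kv1)
            kv1 += 1
          } else if (kv1 >= nnzv1 || (kv2 < nnzv2 && v2Indices(kv2) < v1Indices(kv1))) {
            // index present only in v2
            score = v2Values(kv2)
            kv2 += 1
          } else {
            // here kv1 < nnzv1, kv2 < nnzv2, and v1Indices(kv1) == v2Indices(kv2)
            score = v1Values(kv1) - v2Values(kv2)
            kv1 += 1
            kv2 += 1
          }
          squaredDistance += score * score
        }
        squaredDistance
      }
    }
    ```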

