Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22784#discussion_r228713925
  
    --- Diff: mllib/src/test/scala/org/apache/spark/mllib/feature/PCASuite.scala ---
    @@ -54,4 +55,14 @@ class PCASuite extends SparkFunSuite with MLlibTestSparkContext {
         // check overflowing
         assert(PCAUtil.memoryCost(40000, 60000) > Int.MaxValue)
       }
    +
    +  test("number of features more than 65500") {
    +    val rows = 10
    +    val columns = 100000
    +    val k = 5
    +    val randomRDD = RandomRDDs.normalVectorRDD(sc, rows, columns, 0, 0)
    +    val pca = new PCA(k).fit(randomRDD)
    +    assert(pca.explainedVariance.size === 5)
    +    assert(pca.pc.numRows === 100000 && pca.pc.numCols === 5)
    --- End diff --
    
    I wonder if there's any reasonable way to check the answer here, like some bounds on what the eigenvalues/eigenvectors should be? Like the eigenvalues would at least be positive?


---
