GitHub user mgaido91 opened a pull request:

    https://github.com/apache/spark/pull/19685

    [SPARK-19759][ML] not using blas in ALSModel.predict for optimization

    ## What changes were proposed in this pull request?
    
    In `ALSModel.predict` we currently use `blas.sdot` to compute the dot product of two `Seq`s. This requires a `toArray` copy of each `Seq` on every call, and it turns out not to be the most efficient approach.
    
    I used the following code to compare the implementations:
    
    ```scala
    import com.github.fommil.netlib.BLAS.{getInstance => blas}

    // Times a block and prints the elapsed wall-clock time in nanoseconds.
    def time[R](block: => R): Unit = {
        val t0 = System.nanoTime()
        block
        val t1 = System.nanoTime()
        println("Elapsed time: " + (t1 - t0) + "ns")
    }

    // Plain loop-based dot product, indexing the Seqs directly.
    def f(a: Seq[Float], b: Seq[Float]): Float = {
        var r = 0.0f
        for (i <- 0 until a.length) {
            r += a(i) * b(i)
        }
        r
    }

    val r = new scala.util.Random(100)
    val input = (1 to 500000).map(_ => (1 to 100).map(_ => r.nextFloat).toSeq)
    val b = (1 to 100).map(_ => r.nextFloat).toSeq

    time { input.foreach(a => blas.sdot(100, a.toArray, 1, b.toArray, 1)) }
    // on average it takes 2968718815 ns
    time { input.foreach(a => f(a, b)) }
    // on average it takes 515510185 ns
    ```
    
    Thus this PR proposes replacing the BLAS call with a plain old-style for loop, for performance reasons.
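    
    For reference, this is the shape of the change (a sketch with an illustrative helper name; the actual diff is in the PR):
    
    ```scala
    // Before: two toArray copies per prediction, plus the BLAS call overhead
    // blas.sdot(rank, userFeatures.toArray, 1, itemFeatures.toArray, 1)
    
    // After (sketch): index the Seqs directly, no intermediate arrays
    def dot(a: Seq[Float], b: Seq[Float]): Float = {
        var result = 0.0f
        for (i <- 0 until a.length) {
            result += a(i) * b(i)
        }
        result
    }
    ```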
    
    ## How was this patch tested?
    
    existing UTs


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/mgaido91/spark SPARK-19759

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19685.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #19685
    
----
commit 8b0add68ddf68939c0f5ac19836f9b3b9cc58432
Author: Marco Gaido <mga...@hortonworks.com>
Date:   2017-11-07T16:29:23Z

    [SPARK-19759][ML] not using blas in ALSModel.predict for optimization

----


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
