It's a bug on Breeze's side. Once David fixes it and publishes it to
Maven, we can upgrade to Breeze 0.11.2. Please file a JIRA ticket for
this issue. Thanks.
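
Until the upgrade, here is a minimal standalone sketch of the expected
behavior (a hypothetical SimpleSparse type for illustration only, not
Breeze's actual implementation): dividing a sparse vector by a scalar
should only rescale the stored values, leaving the implicit zeros
untouched.

```scala
// Hypothetical minimal sparse vector, for illustration only --
// this is NOT Breeze's implementation.
case class SimpleSparse(size: Int, indices: Array[Int], values: Array[Double]) {
  // Scalar division only needs to rescale the stored (active) values;
  // the implicit zeros stay zero, so the sparsity pattern is unchanged.
  def divide(s: Double): SimpleSparse = copy(values = values.map(_ / s))

  // Expand to a dense array for easy comparison.
  def toDense: Array[Double] = {
    val out = Array.fill(size)(0.0)
    indices.zip(values).foreach { case (i, v) => out(i) = v }
    out
  }
}

// Same vector as in the failing test: an explicit 0.0 at index 0, 10.0 at index 4.
val vec = SimpleSparse(6, Array(0, 4), Array(0.0, 10.0))
val result = vec.divide(60.0).toDense
println(result.mkString("DenseVector(", ", ", ")"))
// DenseVector(0.0, 0.0, 0.0, 0.0, 0.16666666666666666, 0.0)
```

This matches the dense-vector result in the test below, which is the
correct answer.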

Sincerely,

DB Tsai
-------------------------------------------------------
Blog: https://www.dbtsai.com


On Sun, Mar 15, 2015 at 12:45 AM, Yu Ishikawa
<yuu.ishikawa+sp...@gmail.com> wrote:
> Hi all,
>
> Are there any known bugs when dividing a Breeze sparse vector in Spark
> v1.3.0-rc3? When I tried to divide a sparse vector in Spark v1.3.0-rc3, I got
> a wrong result if the target vector has any zero values.
>
> Spark v1.3.0-rc3 depends on Breeze v0.11.1, and Breeze v0.11.1 seems to have
> a bug when dividing a sparse vector by a scalar value. When dividing a Breeze
> sparse vector that has any explicit zero values, the result seems to be a
> zero vector. However, the same code runs correctly on Spark v1.2.x.
>
> However, there is no problem when multiplying a Breeze sparse vector. I
> asked the Breeze community about this problem in the issue below:
> https://github.com/scalanlp/breeze/issues/382
>
> For example,
> ```
> test("dividing a breeze sparse vector") {
>     val vec = Vectors.sparse(6, Array(0, 4), Array(0.0, 10.0)).toBreeze
>     val n = 60.0
>     val answer1 = vec :/ n
>     val answer2 = vec.toDenseVector :/ n
>     println(vec)
>     println(answer1)
>     println(answer2)
>     assert(answer1.toDenseVector === answer2)
> }
>
> SparseVector((0,0.0), (4,10.0))
> SparseVector()
> DenseVector(0.0, 0.0, 0.0, 0.0, 0.16666666666666666, 0.0)
>
> DenseVector(0.0, 0.0, 0.0, 0.0, 0.0, 0.0) did not equal DenseVector(0.0,
> 0.0, 0.0, 0.0, 0.16666666666666666, 0.0)
> org.scalatest.exceptions.TestFailedException: DenseVector(0.0, 0.0, 0.0,
> 0.0, 0.0, 0.0) did not equal DenseVector(0.0, 0.0, 0.0, 0.0,
> 0.16666666666666666, 0.0)
> ```
>
> Thanks,
> Yu Ishikawa
>
>
>
> -----
> -- Yu Ishikawa
> --
> View this message in context: 
> http://apache-spark-developers-list.1001551.n3.nabble.com/mllib-Is-there-any-bugs-to-divide-a-Breeze-sparse-vectors-at-Spark-v1-3-0-rc3-tp11056.html
> Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org
>
