> Any convenient tool to do this [sparse vector product] in Spark?

Unfortunately, there seem to be very few operations defined for sparse 
vectors. I needed to add some vectors together, and ended up converting them 
to dense numpy arrays and doing the addition on those.
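
For what it's worth, a minimal sketch of that workaround, assuming 
pyspark.mllib.linalg.SparseVector (the sizes and values below are made up 
for illustration):

    import numpy as np
    from pyspark.mllib.linalg import SparseVector

    a = SparseVector(5, {0: 1.0, 3: 2.0})
    b = SparseVector(5, {1: 3.0, 3: 4.0})

    # toArray() returns a dense numpy array, so ordinary numpy
    # arithmetic (addition, dot, ...) applies from there
    dense_sum = a.toArray() + b.toArray()     # array([1., 3., 0., 6., 0.])
    dot = np.dot(a.toArray(), b.toArray())    # 8.0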

Best regards,
Ron


From: Xi Shen [mailto:davidshe...@gmail.com]
Sent: Friday, March 13, 2015 1:50 AM
To: user@spark.apache.org
Subject: How to do sparse vector product in Spark?

Hi,

I have two RDD[Vector]; both Vectors are sparse and of the form:

    (id, value)

"id" indicates the position of the value in the vector space. I want to apply 
a dot product to two such RDD[Vector]s and get a scalar value. Missing 
values are treated as zero.
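
For example (hypothetical values), representing each sparse vector as an 
RDD of (id, value) pairs, the computation I have in mind looks roughly 
like this (assuming an existing SparkContext, sc):

    # ids present in only one vector pair with an implicit zero,
    # so only ids present in both RDDs contribute to the product
    a = sc.parallelize([(0, 1.0), (3, 2.0)])
    b = sc.parallelize([(1, 3.0), (3, 4.0)])

    dot = a.join(b).map(lambda kv: kv[1][0] * kv[1][1]).sum()
    # dot == 8.0, from id 3: 2.0 * 4.0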

Any convenient tool to do this in Spark?


Thanks,
David
