[ https://issues.apache.org/jira/browse/SPARK-29334?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Patrick Pisciuneri updated SPARK-29334:
---------------------------------------
    Description: 
PySpark supports various overloaded operators (e.g. +, -, *, /) for the
DenseVector type that the Scala class does not.

- ML: 
https://github.com/apache/spark/blob/master/python/pyspark/ml/linalg/__init__.py#L441-L462
- MLlib: 
https://github.com/apache/spark/blob/master/python/pyspark/mllib/linalg/__init__.py#L485-L506

We should be able to leverage the BLAS wrappers to implement these methods on
the Scala side.
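
For illustration, a minimal sketch of what the missing overloads could look
like, written as an external enrichment class; the DenseVectorOps name and the
exact operator set are hypothetical. Plain array arithmetic stands in for the
BLAS wrapper here because org.apache.spark.ml.linalg.BLAS is private[spark];
an in-tree patch would delegate to BLAS.axpy for +/- and BLAS.scal for
scalar * instead.

{code:scala}
import org.apache.spark.ml.linalg.{DenseVector, Vectors}

object DenseVectorOpsDemo {
  // Hypothetical enrichment adding pySpark-style operators to DenseVector.
  // Plain array loops stand in for the private[spark] BLAS wrapper; an
  // in-tree patch would call BLAS.axpy / BLAS.scal on a copied vector.
  implicit class DenseVectorOps(v: DenseVector) {

    // Element-wise addition, mirroring pySpark's DenseVector.__add__.
    def +(other: DenseVector): DenseVector = {
      require(v.size == other.size, s"size mismatch: ${v.size} vs ${other.size}")
      new DenseVector(Array.tabulate(v.size)(i => v(i) + other(i)))
    }

    // Element-wise subtraction, mirroring __sub__.
    def -(other: DenseVector): DenseVector = {
      require(v.size == other.size, s"size mismatch: ${v.size} vs ${other.size}")
      new DenseVector(Array.tabulate(v.size)(i => v(i) - other(i)))
    }

    // Scalar multiplication, mirroring __mul__ with a numeric argument.
    def *(a: Double): DenseVector =
      new DenseVector(v.values.map(_ * a))
  }

  def main(args: Array[String]): Unit = {
    val x = Vectors.dense(1.0, 2.0, 3.0).toDense
    val y = Vectors.dense(4.0, 5.0, 6.0).toDense
    println(x + y)   // [5.0,7.0,9.0]
    println(x - y)   // [-3.0,-3.0,-3.0]
    println(x * 2.0) // [2.0,4.0,6.0]
  }
}
{code}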


  was:
PySpark supports various overloaded operators (e.g. +, -, *, /) for the
DenseVector type that the Scala class does not.

# ML

https://github.com/apache/spark/blob/master/python/pyspark/ml/linalg/__init__.py#L441-L462

# MLlib

https://github.com/apache/spark/blob/master/python/pyspark/mllib/linalg/__init__.py#L485-L506

We should be able to leverage the BLAS wrappers to implement these methods on
the Scala side.



> Supported vector operators in Scala should have parity with PySpark 
> --------------------------------------------------------------------
>
>                 Key: SPARK-29334
>                 URL: https://issues.apache.org/jira/browse/SPARK-29334
>             Project: Spark
>          Issue Type: Improvement
>          Components: ML, MLlib
>    Affects Versions: 2.3.5, 2.4.5, 3.0.0
>            Reporter: Patrick Pisciuneri
>            Priority: Minor
>
> PySpark supports various overloaded operators (e.g. +, -, *, /) for the
> DenseVector type that the Scala class does not.
> - ML: 
> https://github.com/apache/spark/blob/master/python/pyspark/ml/linalg/__init__.py#L441-L462
> - MLlib: 
> https://github.com/apache/spark/blob/master/python/pyspark/mllib/linalg/__init__.py#L485-L506
> We should be able to leverage the BLAS wrappers to implement these methods on
> the Scala side.



