[ https://issues.apache.org/jira/browse/SPARK-13944?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15233142#comment-15233142 ]
DB Tsai commented on SPARK-13944:
---------------------------------

There will be no converting back and forth in 2.0. Basically, the `ml` package will use its own `blas` and `vector` types. The reason we don't want to break `mllib` is that people are using it in production, and we don't want to break their production code. The adoption of `ml` is still young, however, and it is an experimental package; as a result, it is more reasonable to change the API there. A type alias could be an option. [~mengxr], what do you think? Thanks.

> Separate out local linear algebra as a standalone module without Spark dependency
> ---------------------------------------------------------------------------------
>
>                 Key: SPARK-13944
>                 URL: https://issues.apache.org/jira/browse/SPARK-13944
>             Project: Spark
>          Issue Type: New Feature
>          Components: Build, ML
>    Affects Versions: 2.0.0
>            Reporter: Xiangrui Meng
>            Assignee: DB Tsai
>            Priority: Blocker
>
> Separate out local linear algebra as a standalone module without Spark dependency to simplify production deployment. We can call the new module spark-mllib-local, which might contain local models in the future.
> The major issue is to remove dependencies on user-defined types.
> The package name will be changed from mllib to ml. For example, Vector will change from `org.apache.spark.mllib.linalg.Vector` to `org.apache.spark.ml.linalg.Vector`. The vector type returned by the new ML pipeline will be the one in the ml package; the existing mllib code will not be touched. As a result, this will potentially break the API. Also, when a vector is loaded from an mllib vector by Spark SQL, it will automatically be converted into the one in the ml package.
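The "type alias" option mentioned in the comment can be sketched in plain Scala. This is an illustrative example only, not actual Spark source: `NewLinalg`, `OldLinalg`, and their members are hypothetical stand-ins showing how an old package path could re-export a type that has moved, so existing client code keeps compiling without changes.

```scala
// Hypothetical sketch of the type-alias migration idea (not Spark code):
// the implementation lives in a "new" location, and the "old" location
// re-exports it via a type alias.
object NewLinalg {
  // The type at its new home (stand-in for org.apache.spark.ml.linalg).
  case class DenseVector(values: Array[Double]) {
    def size: Int = values.length
  }
}

object OldLinalg {
  // The old package path aliases the new type, so callers written
  // against the old path need no source changes.
  type DenseVector = NewLinalg.DenseVector

  def dense(values: Double*): DenseVector =
    NewLinalg.DenseVector(values.toArray)
}

object Demo extends App {
  // Client code still written against the old path works unchanged,
  // and the value is simultaneously an instance of the new type.
  val v: OldLinalg.DenseVector = OldLinalg.dense(1.0, 2.0, 3.0)
  val asNew: NewLinalg.DenseVector = v // no conversion needed
  println(v.size) // prints 3
}
```

The trade-off the comment alludes to: an alias avoids runtime conversion entirely (old and new names denote the same type), but it ties the old package's API surface to the new type, which is exactly the kind of `mllib` change the commenter is wary of.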