Hi Joseph,
Thank you for your nice work and for sharing the draft!
During the next development cycle, new algorithms should be contributed to
spark.mllib. Optionally, wrappers for new (and old) algorithms can be
contributed to spark.ml.
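For context, the split described above can be sketched with a tiny, dependency-free example of the Estimator/Transformer pipeline pattern that spark.ml is built around. The interface and class names below (Estimator, Transformer, MeanCenterer) are simplified stand-ins for illustration only, not the real org.apache.spark.ml API:

```java
import java.util.List;
import java.util.stream.Collectors;

// A Transformer maps one dataset to another (e.g. appends predictions).
interface Transformer {
    List<Double> transform(List<Double> data);
}

// An Estimator learns from data and returns a fitted Transformer (a "Model").
interface Estimator {
    Transformer fit(List<Double> data);
}

// Toy "algorithm": mean-centering, standing in for a real learner that
// would live in spark.mllib and be wrapped by a spark.ml Estimator.
class MeanCenterer implements Estimator {
    public Transformer fit(List<Double> data) {
        double mean = data.stream()
                          .mapToDouble(Double::doubleValue)
                          .average()
                          .orElse(0.0);
        // The fitted model closes over the learned parameter (the mean).
        return xs -> xs.stream().map(x -> x - mean).collect(Collectors.toList());
    }
}

public class Demo {
    public static void main(String[] args) {
        Transformer model = new MeanCenterer().fit(List.of(1.0, 2.0, 3.0));
        System.out.println(model.transform(List.of(4.0, 5.0))); // [2.0, 3.0]
    }
}
```

The point of the pattern is that a spark.ml wrapper only needs to expose fit/transform; the algorithm itself can stay in spark.mllib.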
I understand that we should contribute new algorithms to spark.mllib.
Hi all,
An alpha version of Spark ML is available in the current master branch on GitHub.
If we want to add new machine learning algorithms or modify ones that already
exist, which package should we implement them in:
org.apache.spark.mllib or org.apache.spark.ml?
thanks,
Yu