Hi,

We are in the process of developing a new product/Spark application. While the official Spark 1.4.1 ML guide <http://spark.apache.org/docs/latest/ml-guide.html> invites users and developers to use *spark.mllib* and optionally contribute to *spark.ml*, this StackOverflow post <http://stackoverflow.com/questions/30231840/difference-between-org-apache-spark-ml-classification-and-org-apache-spark-mllib> refers to the design doc, saying that *spark.mllib* will eventually be deprecated.
Could you please confirm which of these is true, and whether we should be concerned given that we plan to develop the app using *spark.mllib*? What would be the timeline for this migration?

Thanks in advance,
Nikhil

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-ml-vs-Spark-mllib-tp24465.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.