Hi John,

I have been using MLlib without installing the jblas native dependency. Functionally, I have not run into any problems. I still need to explore whether there are any performance hits.
Best Regards,
Sonal
Founder, Nube Technologies <http://www.nubetech.co>
<http://in.linkedin.com/in/sonalgoyal>

On Fri, May 8, 2015 at 9:34 PM, John Niekrasz <john.niekr...@gmail.com> wrote:

> Newbie question...
>
> Can I use any of the main ML capabilities of MLlib in a Java-only
> environment, without any native library dependencies?
>
> According to the documentation, netlib-java provides a JVM fallback. This
> suggests that native netlib libraries are not required.
>
> It appears that such a fallback is not available for jblas. However, a
> quick look at the MLlib source suggests that MLlib's dependencies on
> jblas are rather isolated:
>
> > grep -R jblas
> main/scala/org/apache/spark/ml/recommendation/ALS.scala:import org.jblas.DoubleMatrix
> main/scala/org/apache/spark/mllib/optimization/NNLS.scala:import org.jblas.{DoubleMatrix, SimpleBlas}
> main/scala/org/apache/spark/mllib/recommendation/MatrixFactorizationModel.scala:import org.jblas.DoubleMatrix
> main/scala/org/apache/spark/mllib/util/LinearDataGenerator.scala:import org.jblas.DoubleMatrix
> main/scala/org/apache/spark/mllib/util/LinearDataGenerator.scala:    org.jblas.util.Random.seed(42)
> main/scala/org/apache/spark/mllib/util/MFDataGenerator.scala:import org.jblas.DoubleMatrix
> main/scala/org/apache/spark/mllib/util/SVMDataGenerator.scala:import org.jblas.DoubleMatrix
>
> Is it true or false that many of MLlib's capabilities will work perfectly
> fine without any native (non-Java) libraries installed at all?
>
> Thanks for the help,
> John
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/dependencies-on-java-netlib-and-jblas-tp22818.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
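One practical way to tell whether the netlib-java JVM fallback is in effect is to check the driver/executor logs: when no native BLAS can be loaded, netlib-java logs warnings and silently uses its pure-Java (F2J) implementation. Below is a minimal shell sketch; the log file path and timestamps are made up for illustration, but the "Failed to load implementation from:" warning text is what netlib-java actually emits.

```shell
# Hypothetical log excerpt: these WARN lines appear when netlib-java
# cannot load a native BLAS and falls back to the pure-JVM implementation.
cat > /tmp/spark-sample.log <<'EOF'
15/05/08 21:34:00 WARN BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
15/05/08 21:34:00 WARN BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS
EOF

# If the warnings are present, MLlib ran without native BLAS libraries.
if grep -q 'Failed to load implementation from' /tmp/spark-sample.log; then
  echo "netlib-java fell back to the pure-JVM (F2J) implementation"
fi
```

Everything still works in that case; the fallback only matters for the linear-algebra performance question raised above.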