Thanks! I built with sbt (command below), and the .so file is now in the assembly jar (verified below). Now that I have this new assembly jar, how do I run spark-shell so that it can see the .so file when I call the KMeans function? Thanks again for your help with this.

sbt/sbt -Dhadoop.version=2.4.0 -Pyarn -Phive -Pnetlib-lgpl clean update assembly

jar tf assembly/target/scala-2.10/spark-assembly-1.1.0-hadoop2.4.0.jar | grep netlib-native_system-linux-x86_64.so

I am using CentOS 6.5 with Java 7 (/usr/lib/jvm/jre-1.7.0-openjdk.x86_64)
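From what I've read so far, I think the launch would look like this. This is only a sketch, assuming a standard Spark 1.1 source checkout, and /path/to/spark is a placeholder for wherever the build lives; bin/spark-shell is supposed to find the assembly jar under assembly/target on its own:

```shell
# Sketch: launch spark-shell from the build tree so it picks up the
# freshly built assembly jar (and the netlib .so packed inside it).
# /path/to/spark is a placeholder for the actual checkout directory.
cd /path/to/spark
./bin/spark-shell
```

Once in the shell, my understanding is that `com.github.fommil.netlib.BLAS.getInstance().getClass.getName` should report a native implementation rather than `F2jBLAS` if the .so was actually loaded. I've also read that the netlib-native_system variant delegates to a system BLAS, so CentOS might additionally need a libblas/liblapack package installed, but I haven't verified that on 6.5.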

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/MLLIB-usage-BLAS-dependency-warning-tp18660p18803.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
