Re: MLLIB usage: BLAS dependency warning
Hi Xiangrui,

Thank you very much for your response. I looked for the .so as you suggested. It is not here:

$ jar tf assembly/target/spark-assembly_2.10-1.1.0-dist/spark-assembly-1.1.0-hadoop2.4.0.jar | grep netlib-native_system-linux-x86_64.so

or here:

$ jar tf assembly/target/spark-assembly_2.10-1.1.0-dist/spark-mllib_2.10-1.1.0.jar | grep netlib-native_system-linux-x86_64.so

However, I do find it here:

$ jar tf /root/.m2/repository/com/github/fommil/netlib/netlib-native_system-linux-x86_64/1.1/netlib-native_system-linux-x86_64-1.1-natives.jar | grep netlib-native_system-linux-x86_64.so

Am I not building it correctly? Should I just add the above jar to the Spark classpath? If so, where exactly do I add it? I tried adding it to .extraClassPath, but that did not help.

Thanks a lot,
jeff

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/MLLIB-usage-BLAS-dependency-warning-tp18660p18775.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
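For reference, classpath entries like the natives jar above normally go in conf/spark-defaults.conf so that both the driver and the executors see them. A hedged sketch, reusing the natives jar path from the local Maven repository above; whether the classpath route alone silences the warning is an assumption (the fix that ultimately worked in this thread was rebuilding the assembly):

```shell
# Illustrative only: append driver/executor classpath entries to spark-defaults.conf.
# The jar path is the one found in the local Maven repository above; adjust to your machine.
cat >> conf/spark-defaults.conf <<'EOF'
spark.driver.extraClassPath   /root/.m2/repository/com/github/fommil/netlib/netlib-native_system-linux-x86_64/1.1/netlib-native_system-linux-x86_64-1.1-natives.jar
spark.executor.extraClassPath /root/.m2/repository/com/github/fommil/netlib/netlib-native_system-linux-x86_64/1.1/netlib-native_system-linux-x86_64-1.1-natives.jar
EOF
```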
Re: MLLIB usage: BLAS dependency warning
That means the -Pnetlib-lgpl option didn't work. Could you use sbt to build the assembly jar and see whether the .so file is inside it? Which system and Java version are you using?

-Xiangrui

On Wed, Nov 12, 2014 at 2:22 PM, jpl <jlefe...@soe.ucsc.edu> wrote:
> Hi Xiangrui, thank you very much for your response. I looked for the .so as you suggested. [...]
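A hedged sketch of the suggested check: one plausible sbt invocation with the netlib-lgpl profile, followed by a grep over the jar listing for the native stub. The flags and paths (Spark 1.1.0, Hadoop 2.4.0, Scala 2.10 layout) follow this thread; adjust them to your checkout:

```shell
# Build the Spark 1.1.0 assembly with the netlib-lgpl profile enabled.
sbt/sbt -Dhadoop.version=2.4.0 -Pyarn -Phive -Pnetlib-lgpl assembly

# A hit here means -Pnetlib-lgpl took effect and the native BLAS stub is bundled.
if jar tf assembly/target/scala-2.10/spark-assembly-1.1.0-hadoop2.4.0.jar \
    | grep -q 'netlib-native_system-linux-x86_64\.so'; then
  echo "native BLAS bundled"
else
  echo "native BLAS missing: re-check -Pnetlib-lgpl"
fi
```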
Re: MLLIB usage: BLAS dependency warning
Thanks! I used sbt (command below) and the .so file is now there (shown below). Now that I have this new assembly jar, how do I run the spark-shell so that it can see the .so file when I call the kmeans function? Thanks again for your help with this.

$ sbt/sbt -Dhadoop.version=2.4.0 -Pyarn -Phive -Pnetlib-lgpl clean update assembly
$ jar tf assembly/target/scala-2.10/spark-assembly-1.1.0-hadoop2.4.0.jar | grep netlib-native_system-linux-x86_64.so

I am using CentOS 6.5 with Java 7 (/usr/lib/jvm/jre-1.7.0-openjdk.x86_64).

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/MLLIB-usage-BLAS-dependency-warning-tp18660p18803.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Re: MLLIB usage: BLAS dependency warning
Hi Xiangrui,

All is well -- got it working now. I just recompiled with sbt with the additional package flag, and that created all the /bin files. Then, when I start spark-shell, the web UI Environment tab shows the assembly jar in Spark's classpath entries, and the kmeans function now finds it: no more WARN messages. Thank you very much.

Best,
jeff

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/MLLIB-usage-BLAS-dependency-warning-tp18660p18819.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
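For anyone landing here later, a hedged sanity check (not from the thread): netlib-java can report which BLAS backend it resolved, so you can confirm the fix from inside spark-shell by piping a one-liner into it. NativeSystemBLAS or NativeRefBLAS means the native code loaded; F2jBLAS means the pure-Java fallback is still in use:

```shell
# Ask netlib-java which BLAS implementation it picked; the class name appears
# in the shell output among the startup logs.
echo 'println(com.github.fommil.netlib.BLAS.getInstance().getClass.getName)' \
  | ./bin/spark-shell --master local 2>/dev/null | grep -E 'BLAS'
```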
MLLIB usage: BLAS dependency warning
Hi,

I am having trouble using the BLAS libs with the MLlib functions. I am using org.apache.spark.mllib.clustering.KMeans (on a single machine) and running the spark-shell with the kmeans example code (from https://spark.apache.org/docs/latest/mllib-clustering.html), which runs successfully, but I get the following warnings in the log:

WARN netlib.BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
WARN netlib.BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS

I compiled Spark 1.1.0 with:

$ mvn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Pnetlib-lgpl -DskipTests clean package

If anyone could please clarify the steps to get the dependencies correctly installed and visible to Spark (from https://spark.apache.org/docs/latest/mllib-guide.html), that would be greatly appreciated.

Using yum, I installed blas.x86_64, lapack.x86_64, gcc-gfortran.x86_64, and libgfortran.x86_64, and then downloaded Breeze and built it successfully with Maven. I verified that /usr/lib/libblas.so.3 and /usr/lib/liblapack.so.3 are present on the machine, and ldconfig -p shows both listed. I also tried adding /usr/lib/ to spark.executor.extraLibraryPath and verified it is present in the Spark web UI Environment tab. I downloaded and compiled jblas with mvn clean install, which creates jblas-1.2.4-SNAPSHOT.jar, and then also tried adding that to spark.executor.extraClassPath, but I still get the same WARN message. Maybe there are a few simple steps that I am missing?

Thanks a lot.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/MLLIB-usage-BLAS-dependency-warning-tp18660.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
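One hedged check that complements the steps above (an editorial sketch, not from the thread): NativeSystemBLAS goes through the system dynamic linker, so the loader itself must be able to resolve libblas.so.3 and liblapack.so.3. The /sbin path for ldconfig is an assumption based on typical CentOS layouts:

```shell
# Confirm the dynamic loader can resolve the system BLAS/LAPACK that
# NativeSystemBLAS links against (libblas.so.3 / liblapack.so.3).
/sbin/ldconfig -p | grep -E 'lib(blas|lapack)\.so\.3' \
  || echo "loader cannot see libblas/liblapack: check ldconfig paths"
```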
Re: MLLIB usage: BLAS dependency warning
Could you try jar tf on the assembly jar and grep for netlib-native_system-linux-x86_64.so?

-Xiangrui

On Tue, Nov 11, 2014 at 7:11 PM, jpl <jlefe...@soe.ucsc.edu> wrote:
> Hi, I am having trouble using the BLAS libs with the MLlib functions. [...]