Hi Xiangrui, thank you very much for your response. I looked for the .so as
you suggested.
It is not here:
$ jar tf assembly/target/spark-assembly_2.10-1.1.0-dist/spark-assembly-1.1.0-hadoop2.4.0.jar | grep netlib-native_system-linux-x86_64.so
or here:
$ jar tf
Thanks! I used sbt (command below) and the .so file is now there (shown
below). Now that I have this new assembly.jar, how do I run the spark-shell
so that it can see the .so file when I call the kmeans function? Thanks
again for your help with this.
sbt/sbt -Dhadoop.version=2.4.0 -Pyarn
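The sbt command above appears truncated in the archive. For reference, in Spark 1.x the netlib-java native proxies are only bundled into the assembly when the LGPL-licensed netlib profile is enabled, so the full build plus a sanity check might look like the sketch below. The `-Pnetlib-lgpl` flag is my assumption about the "additional package flag" mentioned elsewhere in this thread, and the assembly jar path is illustrative:

```shell
# Sketch, not the poster's exact command: rebuild the Spark assembly so
# netlib-java's native system BLAS stubs are included. -Pnetlib-lgpl is
# off by default because of its LGPL license.
sbt/sbt -Dhadoop.version=2.4.0 -Pyarn -Pnetlib-lgpl assembly

# Then confirm the native stub actually made it into the new jar
# (adjust the path to wherever your build places the assembly):
jar tf assembly/target/spark-assembly-*.jar \
  | grep netlib-native_system-linux-x86_64.so
```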
Hi Xiangrui,
All is well, I got it working now. I recompiled with sbt with the
additional package flag, and that created all the /bin files. Then when I
start spark-shell, the web UI Environment page shows the assembly jar in
Spark's classpath entries, and now the kmeans function finds it -- no
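For anyone hitting the same problem, a quick way to tell whether spark-shell actually picked up the natives is to watch the driver log while running the kmeans example: when the native library is missing, netlib-java falls back to its pure-Java implementation and Spark logs a BLAS warning. A hedged sketch (the log path is arbitrary, and the warning text is the standard netlib-java fallback message, not output copied from this thread):

```shell
# Capture the spark-shell session log, then run the KMeans example
# inside the shell as usual.
bin/spark-shell 2>&1 | tee /tmp/spark-shell.log

# Afterwards, grep for netlib activity. A line like
#   WARN BLAS: Failed to load implementation from:
#   com.github.fommil.netlib.NativeSystemBLAS
# means the .so was NOT found and MLlib is using the Java fallback;
# no such warning suggests the native BLAS loaded.
grep -i "netlib\|BLAS" /tmp/spark-shell.log
```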
Hi,
I am having trouble using the BLAS libs with the MLlib functions. I am
using org.apache.spark.mllib.clustering.KMeans (on a single machine) and
running spark-shell with the kmeans example code (from
https://spark.apache.org/docs/latest/mllib-clustering.html), which runs
successfully but I