I want to take advantage of the Breeze linear algebra libraries, built on
netlib-java and used heavily by SparkML. I've found this surprisingly
time-consuming to figure out, and so far have only managed it on MacOS. I
want to do the same on Linux:

$ uname -a
Linux slc10whv 3.8.13-68.3.4.el6uek.x86_64 #2 SMP Tue Jul 14 15:03:36 PDT
2015 x86_64 x86_64 x86_64 GNU/Linux

This is for Spark 1.6.

For MacOS, I was able to find the *.jars in the .ivy2 cache and add them to
a combination of system and application classpaths.
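(For reference, this is roughly how I dug the jars out of the ivy cache on
MacOS -- just a sketch, since your cache location/layout may differ; it
prints nothing if no netlib jars are cached:)

```shell
# Locate any netlib jars that sbt/ivy pulled down into the local cache.
# Prints one path per jar found; silent if the cache has none.
find "$HOME/.ivy2" -name '*netlib*.jar' 2>/dev/null || true
```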

For Linux, I've downloaded the Spark 1.6 source and compiled with sbt like
this:
sbt/sbt -Pyarn -DskipTests=true -Phadoop-2.3 -Dhadoop.version=2.6.0
-Pnetlib-lgpl clean update assembly package

This gives me 'spark-assembly-1.6.0-hadoop2.6.0.jar', which appears to
contain the *.so libs I need, e.g. netlib-native_ref-linux-x86_64.so.
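As a sanity check that my application actually sees these natives at
runtime, I've been probing the classpath from spark-shell like this (a
sketch; the resource name is the one I saw in the jar listing above):

```scala
// Check whether the netlib native lib is visible as a classpath
// resource (name taken from the assembly jar listing).
val res = getClass.getResource("/netlib-native_ref-linux-x86_64.so")
println(if (res == null) "native lib NOT on classpath" else s"found: $res")
```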

Now I want to compile and package my application so it picks these
netlib-java classes up at runtime.  Here's the command I'm using:

spark-submit --properties-file project-defaults.conf --class
"main.scala.SparkLDADemo" --jars
lib/stanford-corenlp-3.6.0.jar,lib/stanford-corenlp-3.6.0-models.jar,/scratch/cmcmulle/programs/spark/spark-1.6.0/assembly/target/scala-2.10/spark-assembly-1.6.0-hadoop2.6.0.jar
target/scala-2.10/sparksql-demo_2.10-1.0.jar
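In case the issue is that --jars doesn't put the assembly on the driver's
system classpath (which I believe netlib-java's native loader needs --
that's an assumption on my part), I'm also considering setting the
extraClassPath properties in project-defaults.conf, along these lines:

```
# project-defaults.conf (sketch; path is the assembly built above)
spark.driver.extraClassPath   /scratch/cmcmulle/programs/spark/spark-1.6.0/assembly/target/scala-2.10/spark-assembly-1.6.0-hadoop2.6.0.jar
spark.executor.extraClassPath /scratch/cmcmulle/programs/spark/spark-1.6.0/assembly/target/scala-2.10/spark-assembly-1.6.0-hadoop2.6.0.jar
```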

Still, I get the dreaded:
"16/03/02 16:49:21 WARN BLAS: Failed to load implementation from:
com.github.fommil.netlib.NativeSystemBLAS
16/03/02 16:49:21 WARN BLAS: Failed to load implementation from:
com.github.fommil.netlib.NativeRefBLAS"
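To see which implementation actually gets bound, netlib-java exposes the
active instance, so I've been checking it from spark-shell like this
(sketch):

```scala
// Print the BLAS implementation netlib-java actually bound to.
// NativeSystemBLAS or NativeRefBLAS means the natives loaded;
// F2jBLAS means it fell back to the pure-Java implementation.
import com.github.fommil.netlib.BLAS
println(BLAS.getInstance().getClass.getName)
```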

Can someone please tell me how to build/configure/run a standalone SparkML
application using spark-submit such that it is able to load/use the
netlib-java classes?

Thanks --



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Using-netlib-java-in-Spark-1-6-on-linux-tp26386.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
