Try building Spark with -Pnetlib-lgpl, which bundles the netlib-java JNI
library into the Spark assembly jar. This is the simplest approach; a rough
build sketch is below.
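As a minimal sketch, assuming you are building from the Spark source root
with Maven (add whatever Hadoop profiles/flags match your environment):

  # Build the Spark assembly with the netlib-lgpl profile enabled, so the
  # netlib-java JNI wrapper classes are packaged into the assembly jar.
  mvn -Pnetlib-lgpl -DskipTests clean package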
If you would rather manage the dependency as part of your own project, make
sure the native library jars end up inside your application's assembly jar,
or pass them to spark-submit via `--jars`, roughly as sketched below.
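For the `--jars` route, something along these lines; the class name and jar
paths are placeholders for whatever your build actually pulls in (netlib-java
core plus the native wrapper for your platform):

  # Ship the netlib-java jars alongside your application jar.
  # Class name and jar paths below are illustrative only.
  spark-submit \
    --class com.example.MyApp \
    --jars /path/to/core-1.1.2.jar,/path/to/netlib-native_system-linux-x86_64-1.1-natives.jar \
    /path/to/my-app-assembly.jar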
-Xiangrui

On Mon, Nov 24, 2014 at 8:51 AM, agg212 <alexander_galaka...@brown.edu> wrote:
> Hi, I'm trying to improve performance for Spark's MLlib, and I am having
> trouble getting the native netlib-java libraries installed/recognized by Spark.
> I am running on a single machine (Ubuntu 14.04), and here is what I've tried:
>
> sudo apt-get install libgfortran3
> sudo apt-get install libatlas3-base libopenblas-base
> (this is how netlib-java's website says to install it)
>
> I also double-checked, and it looks like the libraries are linked correctly
> in /usr/lib (see below):
> /usr/lib/libblas.so.3 -> /etc/alternatives/libblas.so.3
> /usr/lib/liblapack.so.3 -> /etc/alternatives/liblapack.so.3
>
>
> The "Dependencies" section on Spark's Mllib website also says to include
> "com.github.fommil.netlib:all:1.1.2" as a dependency.  I therefore tried
> adding this to my sbt file like so:
>
> libraryDependencies += "com.github.fommil.netlib" % "all" % "1.1.2"
>
> After all this, I'm still seeing the following error message.  Does anyone
> have more detailed installation instructions?
>
> 14/11/24 16:49:29 WARN BLAS: Failed to load implementation from:
> com.github.fommil.netlib.NativeSystemBLAS
> 14/11/24 16:49:29 WARN BLAS: Failed to load implementation from:
> com.github.fommil.netlib.NativeRefBLAS
>
> Thanks!

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
