It doesn't work if you put the netlib-native jar inside an assembly
jar. Try marking it "provided" in the dependencies, and use --jars to
include the native jars with spark-submit. -Xiangrui
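As a concrete sketch of that setup (the native jar names are copied from this thread; the sbt coordinates and the submit command are illustrative, not verified against a real build):

```scala
// build.sbt fragment (sketch): mark the natives "provided" so sbt-assembly
// leaves them out of the fat jar; they are shipped separately at submit time.
libraryDependencies +=
  "org.scalanlp" % "breeze-natives_2.10" % "0.7" % "provided"

// then include the native jars explicitly with spark-submit, e.g.:
//   spark-submit \
//     --jars netlib-native_ref-linux-x86_64-1.1-natives.jar,\
//            netlib-native_system-linux-x86_64-1.1-natives.jar \
//     myApp-assembly.jar
```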

On Wed, May 14, 2014 at 6:12 PM, wxhsdp <wxh...@gmail.com> wrote:
> Hi, DB
>
>   I tried including the breeze library by using Spark 1.0, and it works.
> But how can I call the native library in standalone cluster mode?
>
>   in local mode:
>   1. I include the "org.scalanlp" % "breeze-natives_2.10" % "0.7" dependency
>      in my sbt build file
>   2. I install OpenBLAS
>   it works.
>
>   in standalone mode:
>   1. I include the "org.scalanlp" % "breeze-natives_2.10" % "0.7" dependency
>      in my sbt build file
>   2. I install OpenBLAS on the workers
>   3. I add separate jars using sc.addJar(). jars: breeze-natives_2.10-0.7.jar,
>      netlib-native_ref-linux-x86_64-1.1-natives.jar,
>      netlib-native_system-linux-x86_64-1.1-natives.jar
>   4. I also include the above jars on the classpath
>   but it does not work :(
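> For reference, steps 3-4 above would look roughly like this in the driver
> code (the local paths are hypothetical):
>
> ```scala
> // Sketch: ship the separate jars to the executors at runtime.
> sc.addJar("/path/to/breeze-natives_2.10-0.7.jar")
> sc.addJar("/path/to/netlib-native_ref-linux-x86_64-1.1-natives.jar")
> sc.addJar("/path/to/netlib-native_system-linux-x86_64-1.1-natives.jar")
> ```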
>
>
>
> DB Tsai-2 wrote
>> Hi Wxhsdp,
>>
>> I also have some difficulties with "sc.addJar()". Since we include the
>> breeze library by using Spark 1.0, we don't have the problem you ran into.
>> However, when we add external jars via sc.addJar(), I found that the
>> executors actually fetch the jars but the classloader still doesn't honor
>> them. I'm trying to figure out the problem now.
>>
>>
>> Sincerely,
>>
>> DB Tsai
>> -------------------------------------------------------
>> My Blog: https://www.dbtsai.com
>> LinkedIn: https://www.linkedin.com/in/dbtsai
>>
>>
>> On Wed, May 14, 2014 at 5:46 AM, wxhsdp <wxhsdp@...> wrote:
>>
>>> Hi, DB
>>>   I've added the breeze jars to the workers using sc.addJar().
>>>   the breeze jars include:
>>>   breeze-natives_2.10-0.7.jar
>>>   breeze-macros_2.10-0.3.jar
>>>   breeze-macros_2.10-0.3.1.jar
>>>   breeze_2.10-0.8-SNAPSHOT.jar
>>>   breeze_2.10-0.7.jar
>>>
>>>   that is almost every breeze-related jar I can find, but I still get
>>>   NoSuchMethodError: breeze.linalg.DenseMatrix
>>>
>>>   from the executor stderr, you can see that the executor successfully
>>>   fetches these jars. what's wrong with my method? thank you!
>>>
>>> 14/05/14 20:36:02 INFO Executor: Fetching
>>> http://192.168.0.106:42883/jars/breeze-natives_2.10-0.7.jar with
>>> timestamp
>>> 1400070957376
>>> 14/05/14 20:36:02 INFO Utils: Fetching
>>> http://192.168.0.106:42883/jars/breeze-natives_2.10-0.7.jar to
>>> /tmp/fetchFileTemp7468892065227766972.tmp
>>> 14/05/14 20:36:02 INFO Executor: Adding
>>>
>>> file:/home/wxhsdp/spark/spark/tags/v1.0.0-rc3/work/app-20140514203557-0000/0/./breeze-natives_2.10-0.7.jar
>>> to class loader
>>> 14/05/14 20:36:02 INFO Executor: Fetching
>>> http://192.168.0.106:42883/jars/breeze-macros_2.10-0.3.jar with timestamp
>>> 1400070957441
>>> 14/05/14 20:36:02 INFO Utils: Fetching
>>> http://192.168.0.106:42883/jars/breeze-macros_2.10-0.3.jar to
>>> /tmp/fetchFileTemp2324565598765584917.tmp
>>> 14/05/14 20:36:02 INFO Executor: Adding
>>>
>>> file:/home/wxhsdp/spark/spark/tags/v1.0.0-rc3/work/app-20140514203557-0000/0/./breeze-macros_2.10-0.3.jar
>>> to class loader
>>> 14/05/14 20:36:02 INFO Executor: Fetching
>>> http://192.168.0.106:42883/jars/breeze_2.10-0.8-SNAPSHOT.jar with
>>> timestamp
>>> 1400070957358
>>> 14/05/14 20:36:02 INFO Utils: Fetching
>>> http://192.168.0.106:42883/jars/breeze_2.10-0.8-SNAPSHOT.jar to
>>> /tmp/fetchFileTemp8730123100104850193.tmp
>>> 14/05/14 20:36:02 INFO Executor: Adding
>>>
>>> file:/home/wxhsdp/spark/spark/tags/v1.0.0-rc3/work/app-20140514203557-0000/0/./breeze_2.10-0.8-SNAPSHOT.jar
>>> to class loader
>>> 14/05/14 20:36:02 INFO Executor: Fetching
>>> http://192.168.0.106:42883/jars/breeze-macros_2.10-0.3.1.jar with
>>> timestamp
>>> 1400070957414
>>> 14/05/14 20:36:02 INFO Utils: Fetching
>>> http://192.168.0.106:42883/jars/breeze-macros_2.10-0.3.1.jar to
>>> /tmp/fetchFileTemp3473404556989515218.tmp
>>> 14/05/14 20:36:02 INFO Executor: Adding
>>>
>>> file:/home/wxhsdp/spark/spark/tags/v1.0.0-rc3/work/app-20140514203557-0000/0/./breeze-macros_2.10-0.3.1.jar
>>> to class loader
>>> 14/05/14 20:36:02 INFO Executor: Fetching
>>> http://192.168.0.106:42883/jars/build-project_2.10-1.0.jar with timestamp
>>> 1400070956753
>>> 14/05/14 20:36:02 INFO Utils: Fetching
>>> http://192.168.0.106:42883/jars/build-project_2.10-1.0.jar to
>>> /tmp/fetchFileTemp1289055585501269156.tmp
>>> 14/05/14 20:36:02 INFO Executor: Adding
>>>
>>> file:/home/wxhsdp/spark/spark/tags/v1.0.0-rc3/work/app-20140514203557-0000/0/./build-project_2.10-1.0.jar
>>> to class loader
>>> 14/05/14 20:36:02 INFO Executor: Fetching
>>> http://192.168.0.106:42883/jars/breeze_2.10-0.7.jar with timestamp
>>> 1400070957228
>>> 14/05/14 20:36:02 INFO Utils: Fetching
>>> http://192.168.0.106:42883/jars/breeze_2.10-0.7.jar to
>>> /tmp/fetchFileTemp1287317286108432726.tmp
>>> 14/05/14 20:36:02 INFO Executor: Adding
>>>
>>> file:/home/wxhsdp/spark/spark/tags/v1.0.0-rc3/work/app-20140514203557-0000/0/./breeze_2.10-0.7.jar
>>> to class loader
>>>
>>>
>>> DB Tsai-2 wrote
>>> > Since the breeze jar is brought into Spark by the mllib package, you
>>> > may want to add mllib as your dependency in Spark 1.0. To bring it in
>>> > from your application yourself, you can either use sbt assembly in your
>>> > build project to generate a flat myApp-assembly.jar which contains the
>>> > breeze jar, or use the Spark addJar API like Yadid said.
>>> >
>>> >
>>> > Sincerely,
>>> >
>>> > DB Tsai
>>> > -------------------------------------------------------
>>> > My Blog: https://www.dbtsai.com
>>> > LinkedIn: https://www.linkedin.com/in/dbtsai
>>> >
>>> >
>>> > On Sun, May 4, 2014 at 10:24 PM, wxhsdp <wxhsdp@...> wrote:
>>> >
>>> >> Hi, DB, I think it's something related to "sbt publishLocal".
>>> >>
>>> >> if I remove the breeze dependency in my sbt file, breeze cannot be
>>> >> found:
>>> >>
>>> >> [error] /home/wxhsdp/spark/example/test/src/main/scala/test.scala:5:
>>> >> not found: object breeze
>>> >> [error] import breeze.linalg._
>>> >> [error]        ^
>>> >>
>>> >> here's my sbt file:
>>> >>
>>> >> name := "Build Project"
>>> >>
>>> >> version := "1.0"
>>> >>
>>> >> scalaVersion := "2.10.4"
>>> >>
>>> >> libraryDependencies += "org.apache.spark" %% "spark-core" %
>>> >> "1.0.0-SNAPSHOT"
>>> >>
>>> >> resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
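>>> >> (the breeze dependency mentioned above would be a line like the
>>> >> following; group/artifact/version taken from earlier in this thread:)
>>> >>
>>> >> ```scala
>>> >> libraryDependencies += "org.scalanlp" % "breeze-natives_2.10" % "0.7"
>>> >> ```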
>>> >>
>>> >> I run "sbt publishLocal" on the Spark tree.
>>> >>
>>> >> but if I manually put spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar in
>>> >> the /lib directory, "sbt package" is ok, and I can run my app on the
>>> >> workers without addJar.
>>> >>
>>> >> what's the difference between adding the dependency in sbt after "sbt
>>> >> publishLocal" and manually putting
>>> >> spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar in the /lib directory?
>>> >>
>>> >> why can I run my app on the workers without addJar this time?
>>> >>
>>> >>
>>> >> DB Tsai-2 wrote
>>> >> > If you add the breeze dependency in your build.sbt project, it will
>>> >> > not be available to all the workers.
>>> >> >
>>> >> > There are a couple of options: 1) use sbt assembly to package breeze
>>> >> > into your application jar; 2) manually copy the breeze jar onto all
>>> >> > the nodes and have it on the classpath; 3) Spark 1.0 has the breeze
>>> >> > jar in the Spark flat assembly jar, so you don't need to add the
>>> >> > breeze dependency yourself.
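>>> >> > A sketch of option 1), with an sbt-assembly plugin version that is
>>> >> > illustrative for this era, not prescribed by this thread:
>>> >> >
>>> >> > ```scala
>>> >> > // project/plugins.sbt (sketch)
>>> >> > addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
>>> >> > ```
>>> >> >
>>> >> > running `sbt assembly` then produces a single myApp-assembly.jar
>>> >> > with breeze included.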
>>> >> >
>>> >> >
>>> >> > Sincerely,
>>> >> >
>>> >> > DB Tsai
>>> >> > -------------------------------------------------------
>>> >> > My Blog: https://www.dbtsai.com
>>> >> > LinkedIn: https://www.linkedin.com/in/dbtsai
>>> >> >
>>> >> >
>>> >> > On Sun, May 4, 2014 at 4:07 AM, wxhsdp <wxhsdp@...> wrote:
>>> >> >
>>> >> >> Hi,
>>> >> >>   I'm trying to use the breeze linalg library for matrix operations
>>> >> >> in my Spark code. I already added the dependency on breeze in my
>>> >> >> build.sbt, and packaged my code successfully.
>>> >> >>
>>> >> >>   when I run in local mode, sbt "run local ...", everything is ok.
>>> >> >>
>>> >> >>   but when I turn to standalone mode, sbt "run spark://127.0.0.1:7077
>>> >> >> ...", an error occurs:
>>> >> >>
>>> >> >> 14/05/04 18:56:29 WARN scheduler.TaskSetManager: Loss was due to
>>> >> >> java.lang.NoSuchMethodError
>>> >> >> java.lang.NoSuchMethodError:
>>> >> >> breeze.linalg.DenseMatrix$.implOpMulMatrix_DMD_DMD_eq_DMD()Lbreeze/linalg/operators/DenseMatrixMultiplyStuff$implOpMulMatrix_DMD_DMD_eq_DMD$;
>>> >> >>
>>> >> >>   in my opinion, everything needed is packaged into the jar file,
>>> >> >> isn't it?
>>> >> >>   and has anyone used breeze before? is it good for matrix
>>> >> >> operations?
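>>> >> >> (for reference, a minimal hedged example of the operation behind
>>> >> >> the error above; it assumes "org.scalanlp" %% "breeze" % "0.7" is
>>> >> >> on the classpath:)
>>> >> >>
>>> >> >> ```scala
>>> >> >> import breeze.linalg.DenseMatrix
>>> >> >>
>>> >> >> val a = DenseMatrix((1.0, 2.0), (3.0, 4.0))
>>> >> >> val b = DenseMatrix((5.0, 6.0), (7.0, 8.0))
>>> >> >> // dense * dense multiply resolves through the implicit
>>> >> >> // implOpMulMatrix_DMD_DMD_eq_DMD named in the stack trace
>>> >> >> val c = a * b
>>> >> >> ```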
>>> >> >>
>>> >> >>
>>> >> >>
>>> >> >> --
>>> >> >> View this message in context:
>>> >> >> http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-breeze-linalg-DenseMatrix-tp5310.html
>>> >> >> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>> >> >>
>>> >>
>>> >>
>>> >>
>>> >>
>>> >>
>>> >>
>>>
>>>
>>>
>>>
>>>
>>>
>
>
>
>
>
