Since the breeze jar is pulled into Spark by the mllib package, you may want
to add mllib as a dependency in Spark 1.0. To bring it in from your
application yourself, you can either use sbt assembly in your build project
to generate a flat myApp-assembly.jar that contains the breeze jar, or use
Spark's addJar API as Yadid said.
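
For reference, here is a rough sketch of what each option looks like in
practice (the versions, the sbt-assembly plugin coordinate, and the jar path
below are only placeholders, adjust them to your build):

// Option 1 (Spark 1.0+): depend on spark-mllib, which pulls in breeze transitively
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.0.0-SNAPSHOT"

// Option 2: bundle breeze into a flat application jar with the sbt-assembly plugin
// in project/plugins.sbt (plugin version is just an example):
//   addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
// then run `sbt assembly` to produce myApp-assembly.jar

// Option 3: ship the breeze jar to the workers from the driver at runtime
//   sc.addJar("/path/to/breeze_2.10-0.7.jar")   // path and version are placeholders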


Sincerely,

DB Tsai
-------------------------------------------------------
My Blog: https://www.dbtsai.com
LinkedIn: https://www.linkedin.com/in/dbtsai


On Sun, May 4, 2014 at 10:24 PM, wxhsdp <wxh...@gmail.com> wrote:

> Hi DB, I think it's something related to "sbt publishLocal".
>
> If I remove the breeze dependency in my sbt file, breeze cannot be found:
>
> [error] /home/wxhsdp/spark/example/test/src/main/scala/test.scala:5: not
> found: object breeze
> [error] import breeze.linalg._
> [error]        ^
>
> here's my sbt file:
>
> name := "Build Project"
>
> version := "1.0"
>
> scalaVersion := "2.10.4"
>
> libraryDependencies += "org.apache.spark" %% "spark-core" %
> "1.0.0-SNAPSHOT"
>
> resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
>
> I ran "sbt publishLocal" on the Spark tree.
>
> But if I manually put spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar in the
> /lib directory, sbt package succeeds and I can run my app on the workers
> without addJar.
>
> What's the difference between adding the dependency in sbt after "sbt
> publishLocal" and manually putting
> spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar in the /lib directory?
>
> Why can I run my app on the workers without addJar this time?
>
>
> DB Tsai-2 wrote
> > If you add the breeze dependency in your build.sbt project, it will not
> > be available to all the workers.
> >
> > There are a couple of options: 1) use sbt assembly to package breeze into
> > your application jar; 2) manually copy the breeze jar onto all the nodes
> > and have it on the classpath; 3) Spark 1.0 has the breeze jar in the Spark
> > flat assembly jar, so you don't need to add the breeze dependency
> > yourself.
> >
> >
> > Sincerely,
> >
> > DB Tsai
> > -------------------------------------------------------
> > My Blog: https://www.dbtsai.com
> > LinkedIn: https://www.linkedin.com/in/dbtsai
> >
> >
> > On Sun, May 4, 2014 at 4:07 AM, wxhsdp <wxhsdp@> wrote:
> >
> >> Hi,
> >>   I'm trying to use the breeze linalg library for matrix operations in
> >> my Spark code. I already added a dependency on breeze in my build.sbt
> >> and packaged my code successfully.
> >>
> >>   When I run in local mode, sbt "run local...", everything is ok.
> >>
> >>   But when I switch to standalone mode, sbt "run spark://127.0.0.1:7077
> >> ...", an error occurs:
> >>
> >> 14/05/04 18:56:29 WARN scheduler.TaskSetManager: Loss was due to
> >> java.lang.NoSuchMethodError
> >> java.lang.NoSuchMethodError:
> >> breeze.linalg.DenseMatrix$.implOpMulMatrix_DMD_DMD_eq_DMD()Lbreeze/linalg/operators/DenseMatrixMultiplyStuff$implOpMulMatrix_DMD_DMD_eq_DMD$;
> >>
> >>   In my opinion, everything needed is packaged into the jar file, isn't
> >> it?
> >>   And has anyone used breeze before? Is it good for matrix operations?
> >>
> >>
> >>
