Hi,

One thing you can do is set the Spark version your project depends on
to "1.0.0-SNAPSHOT" (make sure it matches the version of Spark you're
building); then, before building your project, run "sbt publishLocal"
in the Spark tree so the snapshot artifacts land in your local Ivy
repository.
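
As a rough sketch, the downstream project's build.sbt would look
something like this (the project name and Scala version here are
placeholders; the version string must match what the Spark tree you
built publishes):

```scala
// build.sbt of the downstream project (placeholder name/versions).
name := "my-spark-app"

// Spark 1.0.x builds against Scala 2.10.
scalaVersion := "2.10.4"

// Resolved from the local Ivy repository after `sbt publishLocal`
// has been run in the Spark source tree.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0-SNAPSHOT"
```

By default sbt consults ~/.ivy2/local before remote resolvers, so the
locally published snapshot is picked up without any extra resolver
configuration.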

On Wed, Apr 30, 2014 at 12:11 AM, wxhsdp <wxh...@gmail.com> wrote:
> I fixed it.
>
> I made my sbt project depend on
> spark/trunk/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar
> and it works.
>
>
>
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/NoSuchMethodError-from-Spark-Java-tp4937p5096.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.



-- 
Marcelo