What do you mean by "the last version of spark-0.9.0"?  To be precise,
there isn't anything known as spark-0.9.0.  What was released recently is
spark-0.9.0-incubating, and there is and only ever will be one version of
that.  If you're talking about a 0.9.0-incubating-SNAPSHOT built locally,
then you're going to have to specify a commit number for us to know just
what you've built -- that's the basic, floating nature of SNAPSHOTs, and it
is even more true right now because the master branch of Spark currently
says that it is building 0.9.0-incubating-SNAPSHOT when it should be
1.0.0-incubating-SNAPSHOT.
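If you do need to report what you built locally, the commit hash is the only stable identifier for a SNAPSHOT. A minimal shell sketch (the temporary repo below just stands in for your Spark checkout; in practice you would run only the last command from your Spark source directory):

```shell
# Stand-in repo so the sketch is self-contained; in reality, cd into your Spark checkout.
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name=you -c user.email=you@example.com commit -q --allow-empty -m "local build"

# The exact commit that pins down your 0.9.0-incubating-SNAPSHOT build:
git rev-parse HEAD
```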

If you're not building Spark locally, then it is a matter of getting the
right resolver set in simple.sbt.  If you are re-building Spark (e.g. to
change the Hadoop version), then make sure that you are doing `sbt/sbt
publish-local` after your build to put your newly-built artifacts into your
.ivy2 cache, where other sbt projects can find them.
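For the non-local case, a minimal simple.sbt sketch against the released artifacts (the project name and versions here are illustrative, and the Akka resolver follows what the 0.9.0 quick start guide suggests):

```scala
// simple.sbt -- an illustrative sketch, not the poster's actual build file
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.3"

// %% appends the Scala *binary* version (_2.10) to the artifact name,
// so this resolves org.apache.spark#spark-core_2.10;0.9.0-incubating
libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
```

If you rebuilt Spark yourself instead, you can drop the resolver and rely on the artifacts that `sbt/sbt publish-local` placed in your local .ivy2 cache.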


On Wed, Feb 5, 2014 at 10:40 AM, Dana Tontea <d...@cylex.ro> wrote:

>    Hi Matei,
>
> Firstly, thank you a lot for the answer. You are right, I'm missing the
> hadoop-client dependency locally.
> But in my cluster I deployed the last version of spark-0.9.0, and now the
> same code gives the following error from sbt package:
>
> [warn]  ::::::::::::::::::::::::::::::::::::::::::::::
> [warn]  ::          UNRESOLVED DEPENDENCIES         ::
> [warn]  ::::::::::::::::::::::::::::::::::::::::::::::
> [warn]  :: org.apache.spark#spark-core_2.10.3;0.9.0-incubating: not found
> [warn]  ::::::::::::::::::::::::::::::::::::::::::::::
> [error]
>
> {file:/root/workspace_Spark/scala%20standalone%20app/}default-2327b2/*:update:
> sbt.ResolveException: unresolved dependency:
> org.apache.spark#spark-core_2.10.3;0.9.0-incubating: not found
> [error] Total time: 12 s, completed Feb 5, 2014 8:12:25 PM
> I don't know what I am missing this time...
> My scala -version is:
> Scala code runner version 2.10.3 -- Copyright 2002-2013, LAMP/EPFL
>
> Thanks in advance!
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/What-I-am-missing-from-configuration-tp878p1246.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>