Try running `sbt clean` before building the app, or delete the .ivy2 and .sbt folders (not a good method). Then try to rebuild
the project.
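Sketched as commands (a sketch only; the cache paths assume the default sbt/Ivy locations, and deleting them forces every dependency to be re-downloaded):

```shell
# From the project directory:
sbt clean          # remove compiled artifacts

# Last resort (not recommended): wipe the local dependency caches.
# rm -rf ~/.ivy2 ~/.sbt

sbt package        # rebuild the project
```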
On Thu, Jun 5, 2014 at 11:45 AM, Sean Owen so...@cloudera.com wrote:
I think this is SPARK-1949 again: https://github.com/apache/spark/pull/906
Hi Prabeesh/ Sean,
I tried both the steps you guys mentioned, but it looks like it's still not able to
resolve it.
[warn] [NOT FOUND ]
org.eclipse.jetty.orbit#javax.transaction;1.1.1.v201105210645!javax.transaction.orbit
(131ms)
[warn] public: tried
[warn]
Hi All,
Now that Spark version 1.0.0 is released, there should not be any problem
with the local jars.
Shrikars-MacBook-Pro:SimpleJob shrikar$ cat simple.sbt
name := "Simple Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies ++= Seq("org.apache.spark" %% "spark-core" % "1.0.0",
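For reference, a minimal sketch of what a complete simple.sbt for this job might look like; the original file is truncated above, so the spark-streaming line and the closing of the Seq are assumptions based on the thread's topic:

```scala
// Hypothetical completion of the truncated simple.sbt above.
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.0.0",
  "org.apache.spark" %% "spark-streaming" % "1.0.0"  // assumed, since the job uses streaming
)
```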
Hi Shrikar,
How did you build Spark 1.0.0-SNAPSHOT on your machine? My
understanding is that `sbt publishLocal` is not enough and you really
need `sbt assembly` instead. Give it a try and report back.
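The difference between the two, sketched as commands (run from the root of a Spark source checkout; a sketch, not verified against this exact snapshot):

```shell
# `assembly` builds the fat jar bundling Spark and all of its dependencies:
sbt/sbt assembly

# `publish-local` only publishes Spark's own artifacts to ~/.ivy2/local,
# leaving transitive dependencies to be resolved separately:
sbt/sbt publish-local
```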
As to your build.sbt, upgrade Scala to 2.10.4 and "org.apache.spark"
%% "spark-streaming" %
Still the same error, no change.
Thanks,
Shrikar
On Fri, May 23, 2014 at 2:38 PM, Jacek Laskowski ja...@japila.pl wrote:
Hi Shrikar,
How did you build Spark 1.0.0-SNAPSHOT on your machine? My
understanding is that `sbt publishLocal` is not enough and you really
need `sbt assembly` instead.
Hi All,
I am trying to run the network count example as a separate standalone job
and am running into some issues.
Environment:
1) Mac Mavericks
2) Latest spark repo from Github.
I have a structure like this
Shrikars-MacBook-Pro:SimpleJob shrikar$ find .
.
./simple.sbt
./src
./src/main
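The (truncated) layout above suggests a single source file. A standalone network-word-count job roughly like the one being discussed might look as follows; this is a sketch assuming the Spark 1.0.0 streaming API, and the object name and port are hypothetical:

```scala
// Hypothetical standalone version of the network word count example.
// Assumes spark-core and spark-streaming 1.0.0 are on the classpath.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.StreamingContext._  // brings in reduceByKey on pair DStreams

object SimpleJob {
  def main(args: Array[String]) {
    val conf = new SparkConf().setMaster("local[2]").setAppName("SimpleJob")
    val ssc = new StreamingContext(conf, Seconds(1))

    // Count words arriving on a local socket (e.g. fed by `nc -lk 9999`).
    val lines = ssc.socketTextStream("localhost", 9999)
    val counts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
    counts.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```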
How are you launching the application? `sbt run`? spark-submit? Local
mode or a Spark standalone cluster? Are you packaging all your code into
a jar?
It looks to me like you have Spark classes in your execution
environment but are missing some of Spark's dependencies.
TD
On Thu, May 22, 2014 at
I am running it with `sbt run`, locally.
Thanks,
Shrikar
On Thu, May 22, 2014 at 3:53 PM, Tathagata Das
tathagata.das1...@gmail.comwrote:
How are you launching the application? sbt run ? spark-submit? local
mode or Spark standalone cluster? Are you packaging all your code into
a
How are you getting Spark with 1.0.0-SNAPSHOT through Maven? Did you
publish Spark locally, which allowed you to use it as a dependency?
This is weird indeed. SBT should take care of all of Spark's
dependencies.
In any case, you can try the last released Spark 0.9.1 and see if the
problem
Yes, I did an `sbt publish-local`. OK, I will try with Spark 0.9.1.
Thanks,
Shrikar
On Thu, May 22, 2014 at 8:53 PM, Tathagata Das
tathagata.das1...@gmail.comwrote:
How are you getting Spark with 1.0.0-SNAPSHOT through maven? Did you
publish Spark locally which allowed you to use it as a
Try cleaning your maven (.m2) and ivy cache.
On May 23, 2014, at 12:03 AM, Shrikar archak shrika...@gmail.com wrote:
Yes I did a sbt publish-local. Ok I will try with Spark 0.9.1.
Thanks,
Shrikar
On Thu, May 22, 2014 at 8:53 PM, Tathagata Das tathagata.das1...@gmail.com
wrote:
Hi,
I tried clearing the Maven and Ivy caches, and I am a bit confused at this point
in time.
1) Running the example from the Spark directory using
bin/run-example: it works fine and prints the word counts.
2) Trying to run the same code as a separate job.
*) Using the latest