I used spark-submit to run the MovieLensALS example from the examples
module.
Here is the command (note that --class must come before the application jar; anything after the jar is passed to the application as an argument):

$ spark-submit --master local \
    --class org.apache.spark.examples.mllib.MovieLensALS \
    /home/phoenix/spark/spark-dev/examples/target/scala-2.10/spark-examples-1.0.0-SNAPSHOT-hadoop1.0.4.jar \
    u.data
Hi Stephen,
I am using maven shade plugin for creating my uber jar. I have marked spark
dependencies as provided.
Best Regards,
Sonal
Nube Technologies http://www.nubetech.co
http://in.linkedin.com/in/sonalgoyal
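Sonal's setup can be sketched roughly as the following pom.xml fragments. This is an illustrative assumption about her build, not her actual configuration; the Spark artifact and versions shown are examples:

```xml
<!-- Mark Spark as provided so it is excluded from the uber jar
     (the cluster supplies Spark at runtime) -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.0.0</version>
  <scope>provided</scope>
</dependency>

<!-- Bind maven-shade-plugin to the package phase to build the uber jar -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.2</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this, `mvn package` produces a single jar containing the application and its third-party dependencies but not Spark itself.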
On Mon, May 12, 2014 at 1:04 AM, Stephen Boesch java...@gmail.com wrote:
Hi,
@Sonal - that makes sense. Is the maven shade plugin runnable within sbt? If
so, would you care to share those build.sbt (or .scala) lines? If not, are
you aware of a similar plugin for sbt?
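There is no direct port of the shade plugin to sbt, but the sbt-assembly plugin fills the same role (an uber jar with Spark excluded via "provided" scope). A minimal sketch, assuming the 0.11.x plugin line that was current around Spark 1.0:

```scala
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

// build.sbt
import AssemblyKeys._

assemblySettings

// "provided" keeps Spark out of the assembly, mirroring the maven shade setup
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0" % "provided"
```

Running `sbt assembly` then emits a single jar suitable for spark-submit.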
2014-05-11 23:53 GMT-07:00 Sonal Goyal sonalgoy...@gmail.com:
Will sbt-pack and the maven solution work for the Scala REPL?
I need the REPL because it saves a lot of time when I'm playing with large data
sets: I load them once, cache them, and then try things out interactively
before putting them into a standalone driver.
I've got sbt working for my own
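For the REPL use case, an assembled dependency jar can be put on the shell's classpath with the --jars flag, so cached data sets stay available while iterating. A sketch with hypothetical paths:

```shell
# Start the REPL with your (non-Spark) dependency assembly on the classpath;
# the jar path here is illustrative
$ spark-shell --master local[4] --jars /path/to/deps-assembly.jar
```

The jars listed are also shipped to executors, so classes resolve on the workers as well as in the driver.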
Hi Sonal,
Yes, I am working towards that same idea. How did you go about creating
the non-spark-jar dependencies? The way I am doing it is a separate
straw-man project that does not include Spark but has the external third
party jars included. Then running sbt compile:managedClasspath and
Doesn't the run-example script work for you? Also, are you on the latest
commit of branch-1.0?
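The script TD mentions is bin/run-example in the Spark distribution, which sets up the examples classpath itself. Roughly:

```shell
# Run from the Spark home directory; the fully qualified class name works,
# and on recent branches the short name MovieLensALS may also be accepted
$ ./bin/run-example org.apache.spark.examples.mllib.MovieLensALS u.data
```

This avoids hand-assembling the spark-submit invocation for the bundled examples.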
TD
On Mon, May 5, 2014 at 7:51 PM, Soumya Simanta soumya.sima...@gmail.com wrote:
Yes, I'm struggling with a similar problem where my classes are not found on the
worker nodes. I'm using 1.0.0-SNAPSHOT. I would really appreciate it if someone
could provide some documentation on the usage of spark-submit.
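A common cause of classes not being found on workers is that dependency jars never get shipped to the executors; spark-submit's --jars flag (or an uber jar built as described above) addresses this. A hedged sketch, where the class name, master URL, and paths are all hypothetical placeholders:

```shell
# Ship the application jar plus extra dependency jars to the executors;
# --class and all options must come before the application jar
$ spark-submit \
    --master spark://master:7077 \
    --class com.example.MyDriver \
    --jars /path/to/dep1.jar,/path/to/dep2.jar \
    /path/to/my-app.jar arg1
```

Arguments after the application jar (here, arg1) are passed through to the driver's main method.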
Thanks
On May 5, 2014, at 10:24 PM, Stephen Boesch java...@gmail.com wrote: