Re: Running a spark-submit compatible app in spark-shell

2014-06-04 Thread Roger Hoover
It took me a little while to get back to this but it works now!! I'm invoking the shell like this: spark-shell --jars target/scala-2.10/spark-etl_2.10-1.0.jar. Once inside, I can invoke a method in my package to run the job. > val result = etl.IP2IncomeJob.job(sc) On Tue, May 27, 2014 at 8:42
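For reference, a minimal sketch of the pattern that makes this work: an object whose entry method takes the shell's SparkContext, plus a thin main() so the same jar also runs under spark-submit. The object and method names mirror the thread; the job body and paths are hypothetical.

    // src/main/scala/etl/IP2IncomeJob.scala (sketch; body is illustrative only)
    package etl

    import org.apache.spark.{SparkConf, SparkContext}

    object IP2IncomeJob {
      // All of the real work lives here, so spark-shell can drive it directly:
      //   scala> val result = etl.IP2IncomeJob.job(sc)
      def job(sc: SparkContext) =
        sc.textFile("data/ip_income.csv").map(_.split(","))

      // Thin entry point for spark-submit.
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("IP2IncomeJob"))
        job(sc).saveAsTextFile("out/ip_income")
        sc.stop()
      }
    }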

Re: Running a spark-submit compatible app in spark-shell

2014-05-27 Thread Roger Hoover
Thanks, Andrew. I'll give it a try. On Mon, May 26, 2014 at 2:22 PM, Andrew Or wrote: > Hi Roger, > > This was due to a bug in the Spark shell code, and is fixed in the latest > master (and RC11). Here is the commit that fixed it: > https://github.com/apache/spark/commit/8edbee7d1b4afc192d97ba

Re: Running a spark-submit compatible app in spark-shell

2014-05-26 Thread Andrew Or
Hi Roger, This was due to a bug in the Spark shell code, and is fixed in the latest master (and RC11). Here is the commit that fixed it: https://github.com/apache/spark/commit/8edbee7d1b4afc192d97ba192a5526affc464205. Try it now and it should work. :) Andrew 2014-05-26 10:35 GMT+02:00 Perttu Ra

Re: Running a spark-submit compatible app in spark-shell

2014-05-26 Thread Perttu Ranta-aho
Hi Roger, Were you able to solve this? -Perttu On Tue, Apr 29, 2014 at 8:11 AM, Roger Hoover wrote: > Patrick, > > Thank you for replying. That didn't seem to work either. I see the > option parsed using verbose mode. > > Parsed arguments: > ... > driverExtraClassPath > /Users/rhoover/Wo

Re: Running a spark-submit compatible app in spark-shell

2014-04-28 Thread Roger Hoover
Patrick, Thank you for replying. That didn't seem to work either. I see the option parsed using verbose mode.

    Parsed arguments:
      ...
      driverExtraClassPath  /Users/rhoover/Work/spark-etl/target/scala-2.10/spark-etl_2.10-1.0.jar

But the jar still doesn't show up if I run ":cp" in the REPL and t

Re: Running a spark-submit compatible app in spark-shell

2014-04-28 Thread Patrick Wendell
What about if you run

    ./bin/spark-shell --driver-class-path=/path/to/your/jar.jar

I think either this or the --jars flag should work, but it's possible there is a bug with the --jars flag when calling the REPL. On Mon, Apr 28, 2014 at 4:30 PM, Roger Hoover wrote: > A couple of issues: > 1) the
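For clarity, the two invocations being compared, with a placeholder jar path (in the pre-1.0 builds discussed in this thread, the --jars form was the one suspected of being buggy):

    ./bin/spark-shell --driver-class-path /path/to/your/jar.jar
    ./bin/spark-shell --jars /path/to/your/jar.jar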

Re: Running a spark-submit compatible app in spark-shell

2014-04-28 Thread Roger Hoover
A couple of issues:
1) The jar doesn't show up on the classpath even though SparkSubmit had it in the --jars option. I tested this by running :cp in spark-shell.
2) After adding it to the classpath with :cp /Users/rhoover/Work/spark-etl/target/scala-2.10/spark-etl_2.10-1.0.jar, it still fails.

Re: Running a spark-submit compatible app in spark-shell

2014-04-28 Thread Roger Hoover
Matei, thank you. That seemed to work, but I'm not able to import a class from my jar. Using the verbose option, I can see that my jar should be included:

    Parsed arguments:
      ...
      jars  /Users/rhoover/Work/spark-etl/target/scala-2.10/spark-etl_2.10-1.0.jar

And I see the class I want to load in t
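For anyone reproducing this, the "Parsed arguments" dump above is presumably what launching with the --verbose flag prints, e.g. with the jar path from the message; note it only confirms the flag was parsed, not that the jar actually reached the REPL classpath:

    spark-shell --verbose --jars /Users/rhoover/Work/spark-etl/target/scala-2.10/spark-etl_2.10-1.0.jar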

Re: Running a spark-submit compatible app in spark-shell

2014-04-27 Thread Matei Zaharia
Hi Roger, You should be able to use the --jars argument of spark-shell to add JARs onto the classpath and then work with those classes in the shell. (A recent patch, https://github.com/apache/spark/pull/542, made spark-shell use the same command-line arguments as spark-submit). But this is a gr
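Concretely, once spark-shell shares spark-submit's argument parsing, spark-submit-style flags should be usable on the shell as well. A sketch with placeholder values (not from the thread):

    spark-shell --master local[4] --driver-memory 2g --jars /path/to/spark-etl_2.10-1.0.jar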

Running a spark-submit compatible app in spark-shell

2014-04-27 Thread Roger Hoover
Hi, From the meetup talk about the 1.0 release, I saw that spark-submit will be the preferred way to launch apps going forward. How do you recommend launching such jobs in a development cycle? For example, how can I load an app that's expecting to be given to spark-submit into spark-shell? Also
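For reference, "spark-submit compatible" here means an app launched along these lines; a sketch assuming a main class named after the package mentioned later in the thread (the class name and master value are placeholders):

    spark-submit --class etl.IP2IncomeJob --master local[4] target/scala-2.10/spark-etl_2.10-1.0.jar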