Re: Unable to run Spark application

2015-04-01 Thread Vijayasarathy Kannan
Managed to make "sbt assembly" work. Now I run into another issue: when I do "./sbin/start-all.sh", the script fails saying JAVA_HOME is not set, although I have explicitly set that variable to point to the correct Java location. The same happens with the "./sbin/start-master.sh" script. Any idea what I might be missing?
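One common cause: the standalone scripts read their environment from conf/spark-env.sh, and start-all.sh launches worker daemons over ssh in non-interactive shells, which may not pick up a JAVA_HOME exported only in your login shell. A minimal sketch of the fix, assuming a typical JDK location (adjust the path for your machine):

    # conf/spark-env.sh -- sourced by the sbin/start-*.sh scripts
    export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64  # example path; point at your JDK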

Re: Unable to run Spark application

2015-04-01 Thread Isaque Alves
You could try to run the 'make-distribution.sh' script with the proper options for your case. The script is in the extracted spark-1.2.1 folder.
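For reference, a typical invocation for a Hadoop 2.4 / YARN setup looks like the one below; the profiles are examples from the "Building Spark" docs, so pick the ones matching your environment:

    ./make-distribution.sh --name custom-spark --tgz -Phadoop-2.4 -Pyarn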

Re: Unable to run Spark application

2015-04-01 Thread Vijayasarathy Kannan
That is failing too, with "sbt.ResolveException: unresolved dependency: org.apache.spark#spark-network-common_2.10;1.2.1".

On Wed, Apr 1, 2015 at 1:24 PM, Marcelo Vanzin wrote:
> Try "sbt assembly" instead.
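For what it's worth, if the command was run inside an application project rather than the Spark source tree, an sbt.ResolveException usually means the Spark artifacts could not be fetched from the configured resolvers. The 1.2.1 artifacts are published to Maven Central, so a minimal build definition along these lines should resolve them (project name and Scala version are placeholders):

    // build.sbt -- hypothetical minimal application build against Spark 1.2.1
    name := "my-spark-app"    // placeholder
    scalaVersion := "2.10.4"  // Spark 1.2.x artifacts are built for Scala 2.10
    // spark-core pulls in spark-network-common_2.10 transitively
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1"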

Re: Unable to run Spark application

2015-04-01 Thread Marcelo Vanzin
Try "sbt assembly" instead. On Wed, Apr 1, 2015 at 10:09 AM, Vijayasarathy Kannan wrote: > Why do I get > "Failed to find Spark assembly JAR. > You need to build Spark before running this program." ? > > I downloaded "spark-1.2.1.tgz" from the downloads page and extracted it. > When I do "sbt pac

Unable to run Spark application

2015-04-01 Thread Vijayasarathy Kannan
Why do I get "Failed to find Spark assembly JAR. You need to build Spark before running this program."? I downloaded "spark-1.2.1.tgz" from the downloads page and extracted it. When I do "sbt package" inside my application, it worked fine. But when I try to run my application, I get the above-mentioned error.
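For context on the error: "sbt package" in an application project builds only the application's own JAR, while the Spark launch scripts look for a Spark assembly JAR inside the Spark directory itself. A quick way to check whether that assembly exists (paths per the 1.2.x layout; the exact JAR name varies with the Hadoop profile):

    # a source build puts the assembly here:
    ls assembly/target/scala-2.10/spark-assembly-*.jar
    # a pre-built binary download (e.g. spark-1.2.1-bin-hadoop2.4.tgz) ships it here instead:
    ls lib/spark-assembly-*.jar

If building from source is not a requirement, grabbing a pre-built package from the same downloads page avoids the build step entirely.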