Re: Unable to run Spark application

2015-04-01 Thread Marcelo Vanzin
Try sbt assembly instead.
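For reference, a minimal sbt-assembly setup looks something like the sketch
below; the plugin version is an example from that era (check the sbt-assembly
releases for one matching your sbt), and pre-0.14 plugin versions also expect
their settings added to build.sbt:

    // project/plugins.sbt
    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")

Then run sbt assembly instead of sbt package to produce a single fat jar.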

On Wed, Apr 1, 2015 at 10:09 AM, Vijayasarathy Kannan kvi...@vt.edu wrote:
 Why do I get
 "Failed to find Spark assembly JAR.
 You need to build Spark before running this program."?

 I downloaded spark-1.2.1.tgz from the downloads page and extracted it.
 Running sbt package inside my application works fine, but when I try to
 run my application, I get the above-mentioned error.

-- 
Marcelo




Re: Unable to run Spark application

2015-04-01 Thread Vijayasarathy Kannan
That is failing too, with

sbt.ResolveException: unresolved
dependency: org.apache.spark#spark-network-common_2.10;1.2.1
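(For context, a build.sbt for a Spark 1.2.1 application would typically look
like the sketch below; the project name is hypothetical, and the scalaVersion
must match the _2.10 suffix of the unresolved artifact. Marking spark-core as
"provided" keeps Spark's own classes out of the assembly jar:)

    // build.sbt -- illustrative sketch
    name := "my-spark-app"

    scalaVersion := "2.10.4"  // must match the _2.10 artifact suffix

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1" % "provided"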



On Wed, Apr 1, 2015 at 1:24 PM, Marcelo Vanzin van...@cloudera.com wrote:

 Try sbt assembly instead.

 On Wed, Apr 1, 2015 at 10:09 AM, Vijayasarathy Kannan kvi...@vt.edu
 wrote:
  Why do I get
  "Failed to find Spark assembly JAR.
  You need to build Spark before running this program."?
 
  I downloaded spark-1.2.1.tgz from the downloads page and extracted it.
  Running sbt package inside my application works fine, but when I try to
  run my application, I get the above-mentioned error.
 
 --
 Marcelo



Re: Unable to run Spark application

2015-04-01 Thread Vijayasarathy Kannan
I managed to get sbt assembly working.

Now I run into another issue. When I run ./sbin/start-all.sh, the script
fails saying JAVA_HOME is not set, although I have explicitly set that
variable to point to the correct Java location. The same happens with the
./sbin/start-master.sh script. Any idea what I might be missing?
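(For reference, one common fix: the startup scripts read conf/spark-env.sh on
each machine, so exporting JAVA_HOME there makes it visible to start-all.sh
even when the variable is not in the daemons' environment. A minimal sketch,
with an example JDK path:)

    # conf/spark-env.sh -- create it by copying conf/spark-env.sh.template
    export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64   # example; point at your JDK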

On Wed, Apr 1, 2015 at 1:32 PM, Vijayasarathy Kannan kvi...@vt.edu wrote:

 That is failing too, with

 sbt.ResolveException: unresolved
 dependency: org.apache.spark#spark-network-common_2.10;1.2.1



 On Wed, Apr 1, 2015 at 1:24 PM, Marcelo Vanzin van...@cloudera.com
 wrote:

 Try sbt assembly instead.

 On Wed, Apr 1, 2015 at 10:09 AM, Vijayasarathy Kannan kvi...@vt.edu
 wrote:
  Why do I get
  "Failed to find Spark assembly JAR.
  You need to build Spark before running this program."?
 
  I downloaded spark-1.2.1.tgz from the downloads page and extracted it.
  Running sbt package inside my application works fine, but when I try to
  run my application, I get the above-mentioned error.
 
 
 --
 Marcelo


Re: Unable to run Spark application

2015-04-01 Thread Isaque Alves
You could try to run the 'make-distribution.sh' script with the proper
options for your case. The script is in the extracted spark-1.2.1 folder.
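A sketch of one such invocation, run from the extracted spark-1.2.1 folder;
the --name value and the Maven profiles are examples, so pick the profiles
matching your Hadoop/YARN setup:

    ./make-distribution.sh --name my-dist --tgz -Phadoop-2.4 -Pyarn

This builds the assembly jar that the launcher scripts look for and packages
a runnable distribution as a .tgz.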