Hello,

I am attempting to install Spark 1.0.1 on a Windows machine, but I've been
running into some difficulties.  When I attempt to run some examples, I am
always met with the same response: "Failed to find Spark assembly JAR.  You
need to build Spark with sbt\sbt assembly before running this program."

I found this odd, as I had downloaded the very latest Spark 1.0.1, which
supposedly comes pre-built and does not require building with sbt.  I then
tried running sbt and was met with: "'sbt' is not recognized as an internal
or external command, operable program or batch file."  Am I approaching this
incorrectly?  Do the pre-built versions only run on UNIX?

Thank you for your time,
-Colin Taylor

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Installing-Spark-1-0-1-tp10990.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
