That is failing too, with "sbt.ResolveException: unresolved dependency: org.apache.spark#spark-network-common_2.10;1.2.1".
On Wed, Apr 1, 2015 at 1:24 PM, Marcelo Vanzin <van...@cloudera.com> wrote:
> Try "sbt assembly" instead.
>
> On Wed, Apr 1, 2015 at 10:09 AM, Vijayasarathy Kannan <kvi...@vt.edu> wrote:
> > Why do I get
> > "Failed to find Spark assembly JAR.
> > You need to build Spark before running this program." ?
> >
> > I downloaded "spark-1.2.1.tgz" from the downloads page and extracted it.
> > When I do "sbt package" inside my application, it worked fine. But when I
> > try to run my application, I get the above mentioned error.
>
> --
> Marcelo
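For context, the "Failed to find Spark assembly JAR" message means the downloaded source tree itself has not been compiled yet; "sbt package" in the user's own application does not build Spark. A minimal sketch of building the assembly inside the extracted source directory, assuming Spark 1.2.x with its bundled sbt launcher and a working JDK on the PATH (directory name and launcher path are as shipped in that tarball):

```shell
# Inside the extracted Spark source tree (the build is environment-dependent
# and takes a while; it produces the assembly JAR under assembly/target/).
cd spark-1.2.1

# Use the sbt launcher bundled with the Spark 1.2.x source distribution
# to build the full assembly that spark-submit and the examples need.
sbt/sbt assembly
```

Alternatively, the pre-built binary packages on the downloads page (e.g. "Pre-built for Hadoop 2.4") already contain the assembly JAR and avoid this step entirely.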