Thanks, Nan Zhu.

You say that my problems are "because you are in the Spark directory: you
don't actually need to do that; the dependency on Spark is resolved by sbt."

I did try it initially in what I thought was a much more typical place,
e.g. ~/mywork/sparktest1.  But as I said in my email:

(Just for fun, I also did what I thought was more logical, which was to set
my working directory to $SPARK_HOME/mysparktest and run $SPARK_HOME/sbt/sbt
package, but that was even less successful: I got an error:
awk: cmd. line:1: fatal: cannot open file `./project/build.properties' for
reading (No such file or directory)
Attempting to fetch sbt
/usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
directory
/usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
directory
Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please
install sbt manually from http://www.scala-sbt.org/
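
(Looking at that output more closely, I think I can see what went wrong:
the $SPARK_HOME/sbt/sbt wrapper script apparently reads the sbt version out
of ./project/build.properties relative to the current working directory.
Run from mysparktest, which has no project/build.properties, the version
comes out empty, which is why it tries to fetch "sbt-launch-.jar" with no
version number and then gives up. A separately installed sbt, run from the
application directory, would sidestep the wrapper entirely, something like:

    $ cd ~/mywork/sparktest1    # or wherever the application lives
    $ sbt package               # sbt installed from http://www.scala-sbt.org/

That is just my reading of the error output, though, not something I have
verified against the script itself.)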



On Mon, Mar 24, 2014 at 4:00 PM, Nan Zhu <zhunanmcg...@gmail.com> wrote:

>  Hi, Diana,
>
> See my inlined answer
>
> --
> Nan Zhu
>
>
> On Monday, March 24, 2014 at 3:44 PM, Diana Carroll wrote:
>
> Has anyone successfully followed the instructions on the Quick Start page
> of the Spark home page to run a "standalone" Scala application?  I can't,
> and I figure I must be missing something obvious!
>
> I'm trying to follow the instructions here as close to "word for word" as
> possible:
>
> http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala
>
> 1.  The instructions don't say what directory to create my test
> application in, but later I'm instructed to run "sbt/sbt" so I conclude
> that my working directory must be $SPARK_HOME.  (Temporarily ignoring that
> it is a little weird to be working directly in the Spark distro.)
>
>
> You can create your application in any directory; just follow the
> standard sbt project directory structure.
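
For reference, the layout sbt expects, as I understand the standard
convention with mysparktest as the project root, is:

    mysparktest/
        simple.sbt
        src/
            main/
                scala/
                    SimpleApp.scala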
>
>
> 2.  Create $SPARK_HOME/mysparktest/src/main/scala/SimpleApp.scala.
>  Copy and paste in the code from the instructions exactly, replacing
> YOUR_SPARK_HOME with my Spark home path.
>
>
> should be correct
>
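For anyone following along, the code being pasted here is the Quick Start
example, roughly as below, with YOUR_SPARK_HOME replaced by /usr/lib/spark
to match the paths in the error output above (details may differ by Spark
version):

    /* SimpleApp.scala */
    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._

    object SimpleApp {
      def main(args: Array[String]) {
        // Count lines containing "a" and "b" in the Spark README
        val logFile = "/usr/lib/spark/README.md"
        val sc = new SparkContext("local", "Simple App", "/usr/lib/spark",
          List("target/scala-2.10/simple-project_2.10-1.0.jar"))
        val logData = sc.textFile(logFile, 2).cache()
        val numAs = logData.filter(line => line.contains("a")).count()
        val numBs = logData.filter(line => line.contains("b")).count()
        println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
      }
    }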
>
> 3.  Create $SPARK_HOME/mysparktest/simple.sbt.  Copy and paste in the
> sbt file from the instructions.
>
>
> should be correct
>
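Likewise, the simple.sbt from the instructions is something like the
following. The version numbers are my guess for the Spark 0.9.x docs, and
note that sbt of this era requires the blank lines between settings:

    name := "Simple Project"

    version := "1.0"

    scalaVersion := "2.10.3"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"

    resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

The libraryDependencies line is what lets sbt resolve Spark on its own,
which is presumably what the answer below means by the dependency being
resolved by sbt.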
>
> 4.  From $SPARK_HOME I run "sbt/sbt package".  It runs through the
> ENTIRE Spark project!  This takes several minutes, and at the end, it says
> "Done packaging".  Unfortunately, there's nothing in the
> $SPARK_HOME/mysparktest/ folder other than what I already had there.
>
>
> Because you are in the Spark directory: you don't actually need to do
> that; the dependency on Spark is resolved by sbt.
>
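In other words, if I am reading this right: sbt builds whatever project is
rooted at the current working directory, so "sbt/sbt package" run from
$SPARK_HOME rebuilds Spark itself and never touches mysparktest/ at all.
The package step has to be run from the application's own directory, with
its own sbt installation:

    $ cd $SPARK_HOME/mysparktest
    $ sbt package    # a standalone sbt, not the $SPARK_HOME/sbt/sbt wrapper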
>
>
> (Just for fun, I also did what I thought was more logical, which was to
> set my working directory to $SPARK_HOME/mysparktest and run
> $SPARK_HOME/sbt/sbt package, but that was even less successful: I got an error:
> awk: cmd. line:1: fatal: cannot open file `./project/build.properties' for
> reading (No such file or directory)
> Attempting to fetch sbt
> /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
> directory
> /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
> directory
> Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please
> install sbt manually from http://www.scala-sbt.org/
>
>
> So, help?  I'm sure these instructions work because people are following
> them every day, but I can't tell what they are supposed to do.
>
> Thanks!
> Diana
>
>
>
