Ah crud, I guess you are right; I am using the sbt I installed manually with my Scala installation.

Well, here is what you can do:
mkdir ~/bin
cd ~/bin
wget http://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/0.13.1/sbt-launch.jar
vi sbt

Put the following contents into your new file:

SBT_OPTS="-Xms512M -Xmx1536M -Xss1M -XX:+CMSClassUnloadingEnabled 
-XX:MaxPermSize=256M"
java $SBT_OPTS -jar `dirname $0`/sbt-launch.jar "$@"

Save and exit with :wq!, then make the script executable:

chmod u+x sbt

Now you can run ~/bin/sbt compile, ~/bin/sbt package, ~/bin/sbt run, etc.

Ognen

On 3/24/14, 3:30 PM, Diana Carroll wrote:
Yeah, that's exactly what I did. Unfortunately it doesn't work:

$SPARK_HOME/sbt/sbt package
awk: cmd. line:1: fatal: cannot open file `./project/build.properties' for reading (No such file or directory)
Attempting to fetch sbt
/usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or directory
/usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or directory
Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please install sbt manually from http://www.scala-sbt.org/



On Mon, Mar 24, 2014 at 4:25 PM, Ognen Duzlevski <og...@plainvanillagames.com> wrote:

    You can use any sbt on your machine, including the one that comes
    with Spark. For example, try:

    ~/path_to_spark/sbt/sbt compile
    ~/path_to_spark/sbt/sbt run <arguments>

    Or you can just add that to your PATH by:

    export PATH=$PATH:~/path_to_spark/sbt

    To make it permanent, you can add it to your ~/.bashrc or
    ~/.bash_profile or ??? depending on the system you are using. If
    you are on Windows, sorry, I can't offer any help there ;)
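
    For example, with bash (adjust ~/path_to_spark to wherever your
    Spark is actually installed):

    echo 'export PATH=$PATH:~/path_to_spark/sbt' >> ~/.bashrc
    source ~/.bashrc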

    Ognen


    On 3/24/14, 3:16 PM, Diana Carroll wrote:
    Thanks, Ognen.

    Unfortunately I'm not able to follow your instructions either.
    In particular:


        sbt compile
        sbt run <arguments if any>


    This doesn't work for me because there's no program on my path
    called "sbt".  The instructions in the Quick Start guide are
    specific that I should call "$SPARK_HOME/sbt/sbt".  I don't have
    any other executable on my system called "sbt".

    Did you download and install sbt separately?  In following the
    Quick Start guide, that was not stated as a requirement, and I'm
    trying to run through the guide word for word.

    Diana


    On Mon, Mar 24, 2014 at 4:12 PM, Ognen Duzlevski
    <og...@plainvanillagames.com> wrote:

        Diana,

        Anywhere on the filesystem you have read/write access (you
        need not be in your spark home directory):

        mkdir myproject
        cd myproject
        mkdir project
        mkdir target
        mkdir -p src/main/scala
        cp $mypath/$mysource.scala src/main/scala/
        cp $mypath/myproject.sbt .
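
        The resulting layout (file names here are just placeholders)
        looks like:

        myproject/
            myproject.sbt
            project/
            target/
            src/main/scala/MyApp.scala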

        Make sure that myproject.sbt has the following in it:

        name := "I NEED A NAME!"

        version := "I NEED A VERSION!"

        scalaVersion := "2.10.3"

        libraryDependencies += "org.apache.spark" % "spark-core_2.10"
        % "0.9.0-incubating"

        If you will be using Hadoop/HDFS functionality, you will also
        need the line below:

        libraryDependencies += "org.apache.hadoop" % "hadoop-client"
        % "2.2.0"

        The above assumes you are using Spark 0.9 and Scala 2.10.3.
        If you are using 0.8.1, adjust the versions accordingly.
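
        Putting it all together, a complete myproject.sbt might look
        like this (the name and version strings are arbitrary
        placeholders):

        name := "My Spark App"

        version := "0.1"

        scalaVersion := "2.10.3"

        libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "0.9.0-incubating"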

        That's it. Now you can do

        sbt compile
        sbt run <arguments if any>

        You can also do

        sbt package

        to produce a jar file of your code, which you can then add to
        the SparkContext at runtime.
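
        For example, in Spark 0.9 you can hand the packaged jar to the
        SparkContext constructor (the master, paths and jar name below
        are illustrative, not literal):

        import org.apache.spark.SparkContext

        object MyApp {
          def main(args: Array[String]) {
            // "local" master and a placeholder jar path; substitute your own
            val sc = new SparkContext("local", "My Spark App", "/path/to/spark",
              List("target/scala-2.10/my-spark-app_2.10-0.1.jar"))
            // ... use sc here ...
            sc.stop()
          }
        }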

        In a more complicated project you may need a somewhat more
        involved hierarchy, like com.github.dianacarroll, which
        translates to src/main/scala/com/github/dianacarroll/. There
        you can put your multiple .scala files, which then have to be
        part of the package com.github.dianacarroll (just put that
        declaration on the first line of each of these Scala files, as
        shown below). I am new to Java/Scala so this is how I do it.
        More educated Java/Scala programmers may tell you otherwise ;)
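
        A minimal sketch of such a file (the path and object name are
        hypothetical), src/main/scala/com/github/dianacarroll/MyApp.scala:

        package com.github.dianacarroll

        object MyApp {
          def main(args: Array[String]) {
            println("Hello from a packaged app")
          }
        }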

        You can get more complicated with the sbt project
        subdirectory, but you can read about sbt and what it can do
        independently; the above is the bare minimum.

        Let me know if that helped.
        Ognen


        On 3/24/14, 2:44 PM, Diana Carroll wrote:

            Has anyone successfully followed the instructions on the
            Quick Start page of the Spark home page to run a
            "standalone" Scala application?  I can't, and I figure I
            must be missing something obvious!

            I'm trying to follow the instructions here as close to
            "word for word" as possible:
            
http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala

            1.  The instructions don't say what directory to create
            my test application in, but later I'm instructed to run
            "sbt/sbt" so I conclude that my working directory must be
            $SPARK_HOME.  (Temporarily ignoring that it is a little
            weird to be working directly in the Spark distro.)

            2.  Create
            $SPARK_HOME/mysparktest/src/main/scala/SimpleApp.scala.
             Copy&paste in the code from the instructions exactly,
            replacing YOUR_SPARK_HOME with my spark home path.

            3.  Create $SPARK_HOME/mysparktest/simple.sbt.
             Copy&paste in the sbt file from the instructions

            4.  From the $SPARK_HOME I run "sbt/sbt package".  It
            runs through the ENTIRE Spark project!  This takes
            several minutes, and at the end, it says "Done
            packaging".  unfortunately, there's nothing in the
            $SPARK_HOME/mysparktest/ folder other than what I already
            had there.

            (Just for fun, I also did what I thought was more
            logical, which is to set my working directory to
            $SPARK_HOME/mysparktest and run $SPARK_HOME/sbt/sbt
            package, but that was even less successful. I got an error:
            awk: cmd. line:1: fatal: cannot open file
            `./project/build.properties' for reading (No such file or
            directory)
            Attempting to fetch sbt
            /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No
            such file or directory
            /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No
            such file or directory
            Our attempt to download sbt locally to
            sbt/sbt-launch-.jar failed. Please install sbt manually
            from http://www.scala-sbt.org/


            So, help?  I'm sure these instructions work because
            people are following them every day, but I can't tell
            what they are supposed to do.

            Thanks!
            Diana




-- "A distributed system is one in which the failure of a computer you didn't even know existed can render your own computer unusable"
    -- Leslie Lamport



--
"A distributed system is one in which the failure of a computer you didn't even know 
existed can render your own computer unusable"
-- Leslie Lamport
