Re: quick start guide: building a standalone scala program

2014-09-25 Thread christy
I encountered the same issue when I went through the tutorial's first
standalone application. I then tried to reinstall sbt, but that didn't help.

Following this thread, I created a workspace directly under the Spark directory and
executed ./sbt/sbt package, which reported that packaging succeeded. But how does that
work? How does sbt know which project location to build?

And although the command ran smoothly, I didn't see that any jar had been created.

Please help.

Thanks,
Christy





Re: quick start guide: building a standalone scala program

2014-09-25 Thread christy
I encountered exactly the same problem. How did you solve this?

Thanks





Re: quick start guide: building a standalone scala program

2014-09-25 Thread Andrew Ash
Hi Christy,

I'm more of a Gradle fan but I know SBT fits better into the Scala
ecosystem as a build tool.  If you'd like to give Gradle a shot try this
skeleton Gradle+Spark repo from my coworker Punya.

https://github.com/punya/spark-gradle-test-example

Good luck!
Andrew

On Thu, Sep 25, 2014 at 1:00 AM, christy 760948...@qq.com wrote:

 I encountered exactly the same problem. How did you solve this?

 Thanks







quick start guide: building a standalone scala program

2014-03-24 Thread Diana Carroll
Has anyone successfully followed the instructions on the Quick Start page
of the Spark home page to run a standalone Scala application?  I can't,
and I figure I must be missing something obvious!

I'm trying to follow the instructions here as close to word for word as
possible:
http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala

1.  The instructions don't say what directory to create my test application
in, but later I'm instructed to run sbt/sbt so I conclude that my working
directory must be $SPARK_HOME.  (Temporarily ignoring that it is a little
weird to be working directly in the Spark distro.)

2.  Create $SPARK_HOME/mysparktest/src/main/scala/SimpleApp.scala.
 Copy and paste in the code from the instructions exactly, replacing
YOUR_SPARK_HOME with my spark home path.

3.  Create $SPARK_HOME/mysparktest/simple.sbt.  Copy and paste in the sbt file
from the instructions.

4.  From the $SPARK_HOME I run sbt/sbt package.  It runs through the
ENTIRE Spark project!  This takes several minutes, and at the end, it says
"Done packaging."  Unfortunately, there's nothing in the
$SPARK_HOME/mysparktest/ folder other than what I already had there.

(Just for fun, I also did what I thought was more logical, which is to set my
working directory to $SPARK_HOME/mysparktest and run $SPARK_HOME/sbt/sbt
package, but that was even less successful: I got an error:
awk: cmd. line:1: fatal: cannot open file `./project/build.properties' for
reading (No such file or directory)
Attempting to fetch sbt
/usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
directory
/usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
directory
Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please
install sbt manually from http://www.scala-sbt.org/
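
For reference, the two files from steps 2 and 3 contain roughly the following (this is a
sketch paraphrased from the 0.9 quick start guide, so the exact contents there may differ
slightly; the project name, version, and paths are the ones the guide uses as examples):

// mysparktest/src/main/scala/SimpleApp.scala
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "YOUR_SPARK_HOME/README.md"  // replaced with the real Spark home path
    val sc = new SparkContext("local", "Simple App", "YOUR_SPARK_HOME",
      List("target/scala-2.10/simple-project_2.10-1.0.jar"))
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}

// mysparktest/simple.sbt
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.3"

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"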


So, help?  I'm sure these instructions work because people are following
them every day, but I can't tell what they are supposed to do.

Thanks!
Diana


Re: quick start guide: building a standalone scala program

2014-03-24 Thread Nan Zhu
Hi, Diana,   

See my inlined answer  

--  
Nan Zhu



On Monday, March 24, 2014 at 3:44 PM, Diana Carroll wrote:

 Has anyone successfully followed the instructions on the Quick Start page of 
 the Spark home page to run a standalone Scala application?  I can't, and I 
 figure I must be missing something obvious!
  
 I'm trying to follow the instructions here as close to word for word as 
 possible:
 http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala
  
 1.  The instructions don't say what directory to create my test application 
 in, but later I'm instructed to run sbt/sbt so I conclude that my working 
 directory must be $SPARK_HOME.  (Temporarily ignoring that it is a little 
 weird to be working directly in the Spark distro.)

You can create your application in any directory, just follow the sbt project 
dir structure
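
For example, a minimal layout along those lines looks roughly like this (the project
and file names are only placeholders):

    mysparktest/
        simple.sbt
        src/main/scala/SimpleApp.scala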
  
 2.  Create $SPARK_HOME/mysparktest/src/main/scala/SimpleApp.scala.  
 Copypaste in the code from the instructions exactly, replacing 
 YOUR_SPARK_HOME with my spark home path.

should be correct  
  
 3.  Create $SPARK_HOME/mysparktest/simple.sbt.  Copypaste in the sbt file 
 from the instructions

should be correct   
  
 4.  From the $SPARK_HOME I run sbt/sbt package.  It runs through the ENTIRE 
 Spark project!  This takes several minutes, and at the end, it says Done 
 packaging.  unfortunately, there's nothing in the $SPARK_HOME/mysparktest/ 
 folder other than what I already had there.   

That is because you are in the Spark directory; you don't actually need to do that, as the
dependency on Spark is resolved by sbt.
  
  
 (Just for fun, I also did what I thought was more logical, which is set my 
 working directory to $SPARK_HOME/mysparktest, and but $SPARK_HOME/sbt/sbt 
 package, but that was even less successful: I got an error:  
 awk: cmd. line:1: fatal: cannot open file `./project/build.properties' for 
 reading (No such file or directory)
 Attempting to fetch sbt
 /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or 
 directory
 /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or 
 directory
 Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please 
 install sbt manually from http://www.scala-sbt.org/
  
  
  
 So, help?  I'm sure these instructions work because people are following them 
 every day, but I can't tell what they are supposed to do.   
  
 Thanks!  
 Diana
  



Re: quick start guide: building a standalone scala program

2014-03-24 Thread Yana Kadiyska
I am able to run standalone apps. I think you are making one mistake
that throws you off from there onwards. You don't need to put your app
under SPARK_HOME. I would create it in its own folder somewhere; it
follows the rules of any standalone Scala program (including the
layout). In the guide, $SPARK_HOME is only relevant for finding the README
file which they are parsing/word-counting. But otherwise the compile-time
dependencies on Spark are resolved via the sbt file (or the
pom file if you look at the Java example).

So for example I put my app under C:\Source\spark-code and the jar
gets created in C:\Source\spark-code\target\scala-2.9.3 (or 2.10 if
you're running with Scala 2.10 as the example shows). But for that
part of the guide, it's not any different from building any Scala app.
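
Concretely, the sequence is roughly the following (a sketch; the paths, project name,
and Scala version are illustrative and depend on your own build):

    cd /path/to/spark-code          # your own project directory, not $SPARK_HOME
    sbt package                     # an installed sbt, run from the project directory
    ls target/scala-2.10/           # the packaged jar, e.g. simple-project_2.10-1.0.jar, appears here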

On Mon, Mar 24, 2014 at 3:44 PM, Diana Carroll dcarr...@cloudera.com wrote:
 Has anyone successfully followed the instructions on the Quick Start page of
 the Spark home page to run a standalone Scala application?  I can't, and I
 figure I must be missing something obvious!

 I'm trying to follow the instructions here as close to word for word as
 possible:
 http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala

 1.  The instructions don't say what directory to create my test application
 in, but later I'm instructed to run sbt/sbt so I conclude that my working
 directory must be $SPARK_HOME.  (Temporarily ignoring that it is a little
 weird to be working directly in the Spark distro.)

 2.  Create $SPARK_HOME/mysparktest/src/main/scala/SimpleApp.scala.
 Copypaste in the code from the instructions exactly, replacing
 YOUR_SPARK_HOME with my spark home path.

 3.  Create $SPARK_HOME/mysparktest/simple.sbt.  Copypaste in the sbt file
 from the instructions

 4.  From the $SPARK_HOME I run sbt/sbt package.  It runs through the
 ENTIRE Spark project!  This takes several minutes, and at the end, it says
 Done packaging.  unfortunately, there's nothing in the
 $SPARK_HOME/mysparktest/ folder other than what I already had there.

 (Just for fun, I also did what I thought was more logical, which is set my
 working directory to $SPARK_HOME/mysparktest, and but $SPARK_HOME/sbt/sbt
 package, but that was even less successful: I got an error:
 awk: cmd. line:1: fatal: cannot open file `./project/build.properties' for
 reading (No such file or directory)
 Attempting to fetch sbt
 /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
 directory
 /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
 directory
 Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please
 install sbt manually from http://www.scala-sbt.org/


 So, help?  I'm sure these instructions work because people are following
 them every day, but I can't tell what they are supposed to do.

 Thanks!
 Diana



Re: quick start guide: building a standalone scala program

2014-03-24 Thread Diana Carroll
Yana: Thanks.  Can you give me a transcript of the actual commands you are
running?

THanks!
Diana


On Mon, Mar 24, 2014 at 3:59 PM, Yana Kadiyska yana.kadiy...@gmail.com wrote:

 I am able to run standalone apps. I think you are making one mistake
 that throws you off from there onwards. You don't need to put your app
 under SPARK_HOME. I would create it in its own folder somewhere, it
 follows the rules of any standalone scala program (including the
 layout). In the giude, $SPARK_HOME is only relevant to find the Readme
 file which they are parsing/word-counting. But otherwise the compile
 time dependencies on spark would be resolved via the sbt file (or the
 pom file if you look at the Java example).

 So for example I put my app under C:\Source\spark-code and the jar
 gets created in C:\Source\spark-code\target\scala-2.9.3 (or 2.10 if
 you're running with scala 2.10 as the example shows). But for that
 part of the guide, it's not any different than building a scala app.

 On Mon, Mar 24, 2014 at 3:44 PM, Diana Carroll dcarr...@cloudera.com
 wrote:
  Has anyone successfully followed the instructions on the Quick Start
 page of
  the Spark home page to run a standalone Scala application?  I can't,
 and I
  figure I must be missing something obvious!
 
  I'm trying to follow the instructions here as close to word for word as
  possible:
 
 http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala
 
  1.  The instructions don't say what directory to create my test
 application
  in, but later I'm instructed to run sbt/sbt so I conclude that my
 working
  directory must be $SPARK_HOME.  (Temporarily ignoring that it is a little
  weird to be working directly in the Spark distro.)
 
  2.  Create $SPARK_HOME/mysparktest/src/main/scala/SimpleApp.scala.
  Copypaste in the code from the instructions exactly, replacing
  YOUR_SPARK_HOME with my spark home path.
 
  3.  Create $SPARK_HOME/mysparktest/simple.sbt.  Copypaste in the sbt
 file
  from the instructions
 
  4.  From the $SPARK_HOME I run sbt/sbt package.  It runs through the
  ENTIRE Spark project!  This takes several minutes, and at the end, it
 says
  Done packaging.  unfortunately, there's nothing in the
  $SPARK_HOME/mysparktest/ folder other than what I already had there.
 
  (Just for fun, I also did what I thought was more logical, which is set
 my
  working directory to $SPARK_HOME/mysparktest, and but $SPARK_HOME/sbt/sbt
  package, but that was even less successful: I got an error:
  awk: cmd. line:1: fatal: cannot open file `./project/build.properties'
 for
  reading (No such file or directory)
  Attempting to fetch sbt
  /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
  directory
  /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
  directory
  Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please
  install sbt manually from http://www.scala-sbt.org/
 
 
  So, help?  I'm sure these instructions work because people are following
  them every day, but I can't tell what they are supposed to do.
 
  Thanks!
  Diana
 



Re: quick start guide: building a standalone scala program

2014-03-24 Thread Ognen Duzlevski

Diana,

Anywhere on the filesystem you have read/write access (you need not be 
in your spark home directory):


mkdir myproject
cd myproject
mkdir project
mkdir target
mkdir -p src/main/scala
cp $mypath/$mymysource.scala src/main/scala/
cp $mypath/myproject.sbt .

Make sure that myproject.sbt has the following in it:

name := "I NEED A NAME!"

version := "I NEED A VERSION!"

scalaVersion := "2.10.3"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "0.9.0-incubating"


If you will be using Hadoop/HDFS functionality you will need the below 
line also


libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.2.0"

The above assumes you are using Spark 0.9 and Scala 2.10.3. If you are 
using 0.8.1 - adjust appropriately.


That's it. Now you can do

sbt compile
sbt run <arguments if any>

You can also do
sbt package to produce a jar file of your code which you can then add to 
the SparkContext at runtime.
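
A rough sketch of that last step for Spark 0.9 (the master URL, app name, Spark home,
and jar path below are placeholders, not values from this thread):

import org.apache.spark.SparkContext

// Hand the packaged jar to the SparkContext so it is shipped to the executors.
val sc = new SparkContext("spark://master:7077", "My App", "/path/to/spark",
  List("target/scala-2.10/myproject_2.10-1.0.jar"))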


In a more complicated project you may need to have a bit more involved 
hierarchy like com.github.dianacarroll which will then translate to 
src/main/scala/com/github/dianacarroll/ where you can put your multiple 
.scala files which will then have to be a part of a package 
com.github.dianacarroll (you can just put that as your first line in 
each of these scala files). I am new to Java/Scala so this is how I do 
it. More educated Java/Scala programmers may tell you otherwise ;)
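
For example (the package and file names are just the ones used above for illustration):

// src/main/scala/com/github/dianacarroll/SimpleApp.scala
package com.github.dianacarroll

// The package declaration matches the directory path under src/main/scala/.
object SimpleApp {
  def main(args: Array[String]): Unit = {
    println("packaged app compiled and ran")
  }
}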


You can get more sophisticated with the sbt project subdirectory, but you
can read about sbt and what it can do independently; the above is the bare
minimum.


Let me know if that helped.
Ognen

On 3/24/14, 2:44 PM, Diana Carroll wrote:
Has anyone successfully followed the instructions on the Quick Start 
page of the Spark home page to run a standalone Scala application? 
 I can't, and I figure I must be missing something obvious!


I'm trying to follow the instructions here as close to word for word 
as possible:

http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala

1.  The instructions don't say what directory to create my test 
application in, but later I'm instructed to run sbt/sbt so I 
conclude that my working directory must be $SPARK_HOME.  (Temporarily 
ignoring that it is a little weird to be working directly in the Spark 
distro.)


2.  Create $SPARK_HOME/mysparktest/src/main/scala/SimpleApp.scala. 
 Copypaste in the code from the instructions exactly, replacing 
YOUR_SPARK_HOME with my spark home path.


3.  Create $SPARK_HOME/mysparktest/simple.sbt.  Copypaste in the sbt 
file from the instructions


4.  From the $SPARK_HOME I run sbt/sbt package.  It runs through the 
ENTIRE Spark project!  This takes several minutes, and at the end, it 
says Done packaging.  unfortunately, there's nothing in the 
$SPARK_HOME/mysparktest/ folder other than what I already had there.


(Just for fun, I also did what I thought was more logical, which is 
set my working directory to $SPARK_HOME/mysparktest, and but 
$SPARK_HOME/sbt/sbt package, but that was even less successful: I got 
an error:
awk: cmd. line:1: fatal: cannot open file `./project/build.properties' 
for reading (No such file or directory)

Attempting to fetch sbt
/usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or 
directory
/usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or 
directory
Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. 
Please install sbt manually from http://www.scala-sbt.org/



So, help?  I'm sure these instructions work because people are 
following them every day, but I can't tell what they are supposed to do.


Thanks!
Diana




Re: quick start guide: building a standalone scala program

2014-03-24 Thread Diana Carroll
Thanks, Nan Zhu.

You say that my problems are "because you are in Spark directory, don't
need to do that actually, the dependency on Spark is resolved by sbt".

I did try it initially in what I thought was a much more typical place,
e.g. ~/mywork/sparktest1.  But as I said in my email:

(Just for fun, I also did what I thought was more logical, which is set my
working directory to $SPARK_HOME/mysparktest, and but $SPARK_HOME/sbt/sbt
package, but that was even less successful: I got an error:
awk: cmd. line:1: fatal: cannot open file `./project/build.properties' for
reading (No such file or directory)
Attempting to fetch sbt
/usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
directory
/usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
directory
Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please
install sbt manually from http://www.scala-sbt.org/



On Mon, Mar 24, 2014 at 4:00 PM, Nan Zhu zhunanmcg...@gmail.com wrote:

  Hi, Diana,

 See my inlined answer

 --
 Nan Zhu


 On Monday, March 24, 2014 at 3:44 PM, Diana Carroll wrote:

 Has anyone successfully followed the instructions on the Quick Start page
 of the Spark home page to run a standalone Scala application?  I can't,
 and I figure I must be missing something obvious!

 I'm trying to follow the instructions here as close to word for word as
 possible:

 http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala

 1.  The instructions don't say what directory to create my test
 application in, but later I'm instructed to run sbt/sbt so I conclude
 that my working directory must be $SPARK_HOME.  (Temporarily ignoring that
 it is a little weird to be working directly in the Spark distro.)


 You can create your application in any directory, just follow the sbt
 project dir structure


 2.  Create $SPARK_HOME/mysparktest/src/main/scala/SimpleApp.scala.
  Copypaste in the code from the instructions exactly, replacing
 YOUR_SPARK_HOME with my spark home path.


 should be correct


 3.  Create $SPARK_HOME/mysparktest/simple.sbt.  Copypaste in the sbt file
 from the instructions


 should be correct


 4.  From the $SPARK_HOME I run sbt/sbt package.  It runs through the
 ENTIRE Spark project!  This takes several minutes, and at the end, it says
 Done packaging.  unfortunately, there's nothing in the
 $SPARK_HOME/mysparktest/ folder other than what I already had there.


 because you are in Spark directory, don't need to do that actually , the
 dependency on Spark is resolved by sbt



 (Just for fun, I also did what I thought was more logical, which is set my
 working directory to $SPARK_HOME/mysparktest, and but $SPARK_HOME/sbt/sbt
 package, but that was even less successful: I got an error:
 awk: cmd. line:1: fatal: cannot open file `./project/build.properties' for
 reading (No such file or directory)
 Attempting to fetch sbt
 /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
 directory
 /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
 directory
 Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please
 install sbt manually from http://www.scala-sbt.org/


 So, help?  I'm sure these instructions work because people are following
 them every day, but I can't tell what they are supposed to do.

 Thanks!
 Diana





Re: quick start guide: building a standalone scala program

2014-03-24 Thread Diana Carroll
Thanks, Ognen.

Unfortunately I'm not able to follow your instructions either.  In
particular:


 sbt compile
 sbt run arguments if any


This doesn't work for me because there's no program on my path called
sbt.  The instructions in the Quick Start guide are specific that I
should call $SPARK_HOME/sbt/sbt.  I don't have any other executable on my
system called sbt.

Did you download and install sbt separately?  In following the Quick Start
guide, that was not stated as a requirement, and I'm trying to run through
the guide word for word.

Diana


On Mon, Mar 24, 2014 at 4:12 PM, Ognen Duzlevski 
og...@plainvanillagames.com wrote:

 Diana,

 Anywhere on the filesystem you have read/write access (you need not be in
 your spark home directory):

 mkdir myproject
 cd myproject
 mkdir project
 mkdir target
 mkdir -p src/main/scala
 cp $mypath/$mymysource.scala src/main/scala/
 cp $mypath/myproject.sbt .

 Make sure that myproject.sbt has the following in it:

 name := I NEED A NAME!

 version := I NEED A VERSION!

 scalaVersion := 2.10.3

 libraryDependencies += org.apache.spark % spark-core_2.10 %
 0.9.0-incubating

 If you will be using Hadoop/HDFS functionality you will need the below
 line also

 libraryDependencies += org.apache.hadoop % hadoop-client % 2.2.0

 The above assumes you are using Spark 0.9 and Scala 2.10.3. If you are
 using 0.8.1 - adjust appropriately.

 That's it. Now you can do

 sbt compile
 sbt run arguments if any

 You can also do
 sbt package to produce a jar file of your code which you can then add to
 the SparkContext at runtime.

 In a more complicated project you may need to have a bit more involved
 hierarchy like com.github.dianacarroll which will then translate to
 src/main/scala/com/github/dianacarroll/ where you can put your multiple
 .scala files which will then have to be a part of a package
 com.github.dianacarroll (you can just put that as your first line in each
 of these scala files). I am new to Java/Scala so this is how I do it. More
 educated Java/Scala programmers may tell you otherwise ;)

 You can get more complicated with the sbt project subrirectory but you can
 read independently about sbt and what it can do, above is the bare minimum.

 Let me know if that helped.
 Ognen


 On 3/24/14, 2:44 PM, Diana Carroll wrote:

 Has anyone successfully followed the instructions on the Quick Start page
 of the Spark home page to run a standalone Scala application?  I can't,
 and I figure I must be missing something obvious!

 I'm trying to follow the instructions here as close to word for word as
 possible:
 http://spark.apache.org/docs/latest/quick-start.html#a-
 standalone-app-in-scala

 1.  The instructions don't say what directory to create my test
 application in, but later I'm instructed to run sbt/sbt so I conclude
 that my working directory must be $SPARK_HOME.  (Temporarily ignoring that
 it is a little weird to be working directly in the Spark distro.)

 2.  Create $SPARK_HOME/mysparktest/src/main/scala/SimpleApp.scala.
  Copypaste in the code from the instructions exactly, replacing
 YOUR_SPARK_HOME with my spark home path.

 3.  Create $SPARK_HOME/mysparktest/simple.sbt.  Copypaste in the sbt
 file from the instructions

 4.  From the $SPARK_HOME I run sbt/sbt package.  It runs through the
 ENTIRE Spark project!  This takes several minutes, and at the end, it says
 Done packaging.  unfortunately, there's nothing in the
 $SPARK_HOME/mysparktest/ folder other than what I already had there.

 (Just for fun, I also did what I thought was more logical, which is set
 my working directory to $SPARK_HOME/mysparktest, and but
 $SPARK_HOME/sbt/sbt package, but that was even less successful: I got an
 error:
 awk: cmd. line:1: fatal: cannot open file `./project/build.properties'
 for reading (No such file or directory)
 Attempting to fetch sbt
 /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
 directory
 /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
 directory
 Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please
 install sbt manually from http://www.scala-sbt.org/


 So, help?  I'm sure these instructions work because people are following
 them every day, but I can't tell what they are supposed to do.

 Thanks!
 Diana





Re: quick start guide: building a standalone scala program

2014-03-24 Thread Diana Carroll
Yeah, that's exactly what I did. Unfortunately it doesn't work:

$SPARK_HOME/sbt/sbt package
awk: cmd. line:1: fatal: cannot open file `./project/build.properties' for
reading (No such file or directory)
Attempting to fetch sbt
/usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
directory
/usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
directory
Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please
install sbt manually from http://www.scala-sbt.org/



On Mon, Mar 24, 2014 at 4:25 PM, Ognen Duzlevski 
og...@plainvanillagames.com wrote:

  You can use any sbt on your machine, including the one that comes with
 spark. For example, try:

 ~/path_to_spark/sbt/sbt compile
 ~/path_to_spark/sbt/sbt run arguments

 Or you can just add that to your PATH by:

 export PATH=$PATH:~/path_to_spark/sbt

 To make it permanent, you can add it to your ~/.bashrc or ~/.bash_profile
 or ??? depending on the system you are using. If you are on Windows, sorry,
 I can't offer any help there ;)

 Ognen


 On 3/24/14, 3:16 PM, Diana Carroll wrote:

 Thanks Ongen.

  Unfortunately I'm not able to follow your instructions either.  In
 particular:


 sbt compile
 sbt run arguments if any


  This doesn't work for me because there's no program on my path called
 sbt.  The instructions in the Quick Start guide are specific that I
 should call $SPARK_HOME/sbt/sbt.  I don't have any other executable on my
 system called sbt.

  Did you download and install sbt separately?  In following the Quick
 Start guide, that was not stated as a requirement, and I'm trying to run
 through the guide word for word.

  Diana


  On Mon, Mar 24, 2014 at 4:12 PM, Ognen Duzlevski 
 og...@plainvanillagames.com wrote:

 Diana,

 Anywhere on the filesystem you have read/write access (you need not be in
 your spark home directory):

 mkdir myproject
 cd myproject
 mkdir project
 mkdir target
 mkdir -p src/main/scala
 cp $mypath/$mymysource.scala src/main/scala/
 cp $mypath/myproject.sbt .

 Make sure that myproject.sbt has the following in it:

 name := I NEED A NAME!

 version := I NEED A VERSION!

 scalaVersion := 2.10.3

 libraryDependencies += org.apache.spark % spark-core_2.10 %
 0.9.0-incubating

 If you will be using Hadoop/HDFS functionality you will need the below
 line also

 libraryDependencies += org.apache.hadoop % hadoop-client % 2.2.0

 The above assumes you are using Spark 0.9 and Scala 2.10.3. If you are
 using 0.8.1 - adjust appropriately.

 That's it. Now you can do

 sbt compile
 sbt run arguments if any

 You can also do
 sbt package to produce a jar file of your code which you can then add to
 the SparkContext at runtime.

 In a more complicated project you may need to have a bit more involved
 hierarchy like com.github.dianacarroll which will then translate to
 src/main/scala/com/github/dianacarroll/ where you can put your multiple
 .scala files which will then have to be a part of a package
 com.github.dianacarroll (you can just put that as your first line in each
 of these scala files). I am new to Java/Scala so this is how I do it. More
 educated Java/Scala programmers may tell you otherwise ;)

 You can get more complicated with the sbt project subrirectory but you
 can read independently about sbt and what it can do, above is the bare
 minimum.

 Let me know if that helped.
 Ognen


 On 3/24/14, 2:44 PM, Diana Carroll wrote:

 Has anyone successfully followed the instructions on the Quick Start
 page of the Spark home page to run a standalone Scala application?  I
 can't, and I figure I must be missing something obvious!

 I'm trying to follow the instructions here as close to word for word
 as possible:

 http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala

 1.  The instructions don't say what directory to create my test
 application in, but later I'm instructed to run sbt/sbt so I conclude
 that my working directory must be $SPARK_HOME.  (Temporarily ignoring that
 it is a little weird to be working directly in the Spark distro.)

 2.  Create $SPARK_HOME/mysparktest/src/main/scala/SimpleApp.scala.
  Copypaste in the code from the instructions exactly, replacing
 YOUR_SPARK_HOME with my spark home path.

 3.  Create $SPARK_HOME/mysparktest/simple.sbt.  Copypaste in the sbt
 file from the instructions

 4.  From the $SPARK_HOME I run sbt/sbt package.  It runs through the
 ENTIRE Spark project!  This takes several minutes, and at the end, it says
 Done packaging.  unfortunately, there's nothing in the
 $SPARK_HOME/mysparktest/ folder other than what I already had there.

 (Just for fun, I also did what I thought was more logical, which is set
 my working directory to $SPARK_HOME/mysparktest, and but
 $SPARK_HOME/sbt/sbt package, but that was even less successful: I got an
 error:
 awk: cmd. line:1: fatal: cannot open file `./project/build.properties'
 for reading (No such file or directory)
 Attempting to fetch sbt
 /usr/lib/spark/sbt/sbt: line 

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Ognen Duzlevski
Ah crud, I guess you are right, I am using the sbt I installed manually 
with my Scala installation.


Well, here is what you can do:
mkdir ~/bin
cd ~/bin
wget 
http://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/0.13.1/sbt-launch.jar

vi sbt

Put the following contents into your new file:

SBT_OPTS="-Xms512M -Xmx1536M -Xss1M -XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=256M"
java $SBT_OPTS -jar `dirname $0`/sbt-launch.jar "$@"

:wq!

chmod u+x sbt

Now you can do ~/bin/sbt compile, ~/bin/sbt package, ~/bin/sbt run, etc.

Ognen

On 3/24/14, 3:30 PM, Diana Carroll wrote:

Yeah, that's exactly what I did. Unfortunately it doesn't work:

$SPARK_HOME/sbt/sbt package
awk: cmd. line:1: fatal: cannot open file `./project/build.properties' 
for reading (No such file or directory)

Attempting to fetch sbt
/usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or 
directory
/usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or 
directory
Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. 
Please install sbt manually from http://www.scala-sbt.org/




On Mon, Mar 24, 2014 at 4:25 PM, Ognen Duzlevski 
og...@plainvanillagames.com mailto:og...@plainvanillagames.com wrote:


You can use any sbt on your machine, including the one that comes
with spark. For example, try:

~/path_to_spark/sbt/sbt compile
~/path_to_spark/sbt/sbt run arguments

Or you can just add that to your PATH by:

export $PATH=$PATH:~/path_to_spark/sbt

To make it permanent, you can add it to your ~/.bashrc or
~/.bash_profile or ??? depending on the system you are using. If
you are on Windows, sorry, I can't offer any help there ;)

Ognen


On 3/24/14, 3:16 PM, Diana Carroll wrote:

Thanks Ongen.

Unfortunately I'm not able to follow your instructions either.
 In particular:


sbt compile
sbt run arguments if any


This doesn't work for me because there's no program on my path
called sbt.  The instructions in the Quick Start guide are
specific that I should call $SPARK_HOME/sbt/sbt.  I don't have
any other executable on my system called sbt.

Did you download and install sbt separately?  In following the
Quick Start guide, that was not stated as a requirement, and I'm
trying to run through the guide word for word.

Diana


On Mon, Mar 24, 2014 at 4:12 PM, Ognen Duzlevski
og...@plainvanillagames.com
mailto:og...@plainvanillagames.com wrote:

Diana,

Anywhere on the filesystem you have read/write access (you
need not be in your spark home directory):

mkdir myproject
cd myproject
mkdir project
mkdir target
mkdir -p src/main/scala
cp $mypath/$mymysource.scala src/main/scala/
cp $mypath/myproject.sbt .

Make sure that myproject.sbt has the following in it:

name := I NEED A NAME!

version := I NEED A VERSION!

scalaVersion := 2.10.3

libraryDependencies += org.apache.spark % spark-core_2.10
% 0.9.0-incubating

If you will be using Hadoop/HDFS functionality you will need
the below line also

libraryDependencies += org.apache.hadoop % hadoop-client
% 2.2.0

The above assumes you are using Spark 0.9 and Scala 2.10.3.
If you are using 0.8.1 - adjust appropriately.

That's it. Now you can do

sbt compile
sbt run arguments if any

You can also do
sbt package to produce a jar file of your code which you can
then add to the SparkContext at runtime.

In a more complicated project you may need to have a bit more
involved hierarchy like com.github.dianacarroll which will
then translate to src/main/scala/com/github/dianacarroll/
where you can put your multiple .scala files which will then
have to be a part of a package com.github.dianacarroll (you
can just put that as your first line in each of these scala
files). I am new to Java/Scala so this is how I do it. More
educated Java/Scala programmers may tell you otherwise ;)

You can get more complicated with the sbt project
subrirectory but you can read independently about sbt and
what it can do, above is the bare minimum.

Let me know if that helped.
Ognen


On 3/24/14, 2:44 PM, Diana Carroll wrote:

Has anyone successfully followed the instructions on the
Quick Start page of the Spark home page to run a
standalone Scala application?  I can't, and I figure I
must be missing something obvious!

I'm trying to follow the instructions here as close to
word for word as possible:

http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala

1.  The instructions don't say what directory to create
my test 

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Nan Zhu
Hi, Diana,   

You don't need to use the Spark-distributed sbt;

just download sbt from its official website and add its location to your PATH.

Best,  

--  
Nan Zhu



On Monday, March 24, 2014 at 4:30 PM, Diana Carroll wrote:

 Yeah, that's exactly what I did. Unfortunately it doesn't work:
  
 $SPARK_HOME/sbt/sbt package
 awk: cmd. line:1: fatal: cannot open file `./project/build.properties' for 
 reading (No such file or directory)
 Attempting to fetch sbt
 /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or 
 directory
 /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or 
 directory
 Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please 
 install sbt manually from http://www.scala-sbt.org/
  
  
  
  
 On Mon, Mar 24, 2014 at 4:25 PM, Ognen Duzlevski og...@plainvanillagames.com 
 (mailto:og...@plainvanillagames.com) wrote:
  You can use any sbt on your machine, including the one that comes with 
  spark. For example, try:
   
  ~/path_to_spark/sbt/sbt compile
  ~/path_to_spark/sbt/sbt run arguments
   
  Or you can just add that to your PATH by:
   
  export $PATH=$PATH:~/path_to_spark/sbt
   
  To make it permanent, you can add it to your ~/.bashrc or ~/.bash_profile 
  or ??? depending on the system you are using. If you are on Windows, sorry, 
  I can't offer any help there ;)
   
  Ognen
   
   
  On 3/24/14, 3:16 PM, Diana Carroll wrote:
   Thanks Ongen.  

   Unfortunately I'm not able to follow your instructions either.  In 
   particular:
 
sbt compile
sbt run arguments if any

   This doesn't work for me because there's no program on my path called 
   sbt.  The instructions in the Quick Start guide are specific that I 
   should call $SPARK_HOME/sbt/sbt.  I don't have any other executable on 
   my system called sbt.  

   Did you download and install sbt separately?  In following the Quick 
   Start guide, that was not stated as a requirement, and I'm trying to run 
   through the guide word for word.  

   Diana  


   On Mon, Mar 24, 2014 at 4:12 PM, Ognen Duzlevski 
   og...@plainvanillagames.com (mailto:og...@plainvanillagames.com) wrote:
Diana,
 
Anywhere on the filesystem you have read/write access (you need not be 
in your spark home directory):
 
mkdir myproject
cd myproject
mkdir project
mkdir target
mkdir -p src/main/scala
cp $mypath/$mymysource.scala src/main/scala/
cp $mypath/myproject.sbt .
 
Make sure that myproject.sbt has the following in it:
 
name := I NEED A NAME!
 
version := I NEED A VERSION!
 
scalaVersion := 2.10.3
 
libraryDependencies += org.apache.spark % spark-core_2.10 % 
0.9.0-incubating
 
If you will be using Hadoop/HDFS functionality you will need the below 
line also
 
libraryDependencies += org.apache.hadoop % hadoop-client % 2.2.0
 
The above assumes you are using Spark 0.9 and Scala 2.10.3. If you are 
using 0.8.1 - adjust appropriately.
 
That's it. Now you can do
 
sbt compile
sbt run arguments if any
 
You can also do
sbt package to produce a jar file of your code which you can then add 
to the SparkContext at runtime.
 
In a more complicated project you may need to have a bit more involved 
hierarchy like com.github.dianacarroll which will then translate to 
src/main/scala/com/github/dianacarroll/ where you can put your multiple 
.scala files which will then have to be a part of a package 
com.github.dianacarroll (you can just put that as your first line in 
each of these scala files). I am new to Java/Scala so this is how I do 
it. More educated Java/Scala programmers may tell you otherwise ;)
 
You can get more complicated with the sbt project subrirectory but you 
can read independently about sbt and what it can do, above is the bare 
minimum.
 
Let me know if that helped.
Ognen  
 
 
On 3/24/14, 2:44 PM, Diana Carroll wrote:
 Has anyone successfully followed the instructions on the Quick Start 
 page of the Spark home page to run a standalone Scala application?  
 I can't, and I figure I must be missing something obvious!
  
 I'm trying to follow the instructions here as close to word for 
 word as possible:
 http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala
  
 1.  The instructions don't say what directory to create my test 
 application in, but later I'm instructed to run sbt/sbt so I 
 conclude that my working directory must be $SPARK_HOME.  (Temporarily 
 ignoring that it is a little weird to be working directly in the 
 Spark distro.)
  
 2.  Create $SPARK_HOME/mysparktest/src/main/scala/SimpleApp.scala.  
 Copypaste in the code from the instructions exactly, replacing 
 YOUR_SPARK_HOME with my spark home path.
  
 3.  

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Diana Carroll
Thanks for your help, everyone.  Several folks have explained that I can
surely solve the problem by installing sbt.

But I'm trying to get the instructions working *as written on the Spark
website*.  The instructions not only don't have you install sbt
separately...they actually specifically have you use the sbt that is
distributed with Spark.

If it is not possible to build your own Spark programs with
Spark-distributed sbt, then that's a big hole in the Spark docs that I
shall file.  And if the sbt that is included with Spark is MEANT to be able
to compile your own Spark apps, then that's a product bug.

But before I file the bug, I'm still hoping I'm missing something, and
someone will point out that I'm missing a small step that will make the
Spark distribution of sbt work!

Diana



 On Mon, Mar 24, 2014 at 4:52 PM, Yana Kadiyska yana.kadiy...@gmail.com wrote:

 Diana, I just tried it on a clean Ubuntu machine, with Spark 0.8
 (since like other folks I had sbt preinstalled on my usual machine)

 I ran the command exactly as Ognen suggested and see
 Set current project to Simple Project (do you see this -- you should
 at least be seeing this)
 and then a bunch of Resolving ...

 messages. I did get an error there, saying it can't find
 javax.servlet.orbit. I googled the error and found this thread:


 http://mail-archives.apache.org/mod_mbox/spark-user/201309.mbox/%3ccajbo4nexyzqe6zgreqjtzzz5zrcoavfen+wmbyced6n1epf...@mail.gmail.com%3E

 adding the IvyXML fragment they suggested helped in my case (but
 again, the build pretty clearly complained).

 If you're still having no luck, I suggest installing sbt and setting
 SBT_HOME... http://www.scala-sbt.org/

 In either case though, it's not a Spark-specific issue...Hopefully
 some of all this helps.

 On Mon, Mar 24, 2014 at 4:30 PM, Diana Carroll dcarr...@cloudera.com
 wrote:
  Yeah, that's exactly what I did. Unfortunately it doesn't work:
 
  $SPARK_HOME/sbt/sbt package
  awk: cmd. line:1: fatal: cannot open file `./project/build.properties'
 for
  reading (No such file or directory)
  Attempting to fetch sbt
  /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
  directory
  /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
  directory
  Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please
  install sbt manually from http://www.scala-sbt.org/
 
 
 
  On Mon, Mar 24, 2014 at 4:25 PM, Ognen Duzlevski
  og...@plainvanillagames.com wrote:
 
  You can use any sbt on your machine, including the one that comes with
  spark. For example, try:
 
  ~/path_to_spark/sbt/sbt compile
  ~/path_to_spark/sbt/sbt run arguments
 
  Or you can just add that to your PATH by:
 
  export $PATH=$PATH:~/path_to_spark/sbt
 
  To make it permanent, you can add it to your ~/.bashrc or
 ~/.bash_profile
  or ??? depending on the system you are using. If you are on Windows,
 sorry,
  I can't offer any help there ;)
 
  Ognen
 
 
  On 3/24/14, 3:16 PM, Diana Carroll wrote:
 
  Thanks Ongen.
 
  Unfortunately I'm not able to follow your instructions either.  In
  particular:
 
 
  sbt compile
  sbt run arguments if any
 
 
  This doesn't work for me because there's no program on my path called
  sbt.  The instructions in the Quick Start guide are specific that I
 should
  call $SPARK_HOME/sbt/sbt.  I don't have any other executable on my
 system
  called sbt.
 
  Did you download and install sbt separately?  In following the Quick
 Start
  guide, that was not stated as a requirement, and I'm trying to run
 through
  the guide word for word.
 
  Diana
 
 
  On Mon, Mar 24, 2014 at 4:12 PM, Ognen Duzlevski
  og...@plainvanillagames.com wrote:
 
  Diana,
 
  Anywhere on the filesystem you have read/write access (you need not be
 in
  your spark home directory):
 
  mkdir myproject
  cd myproject
  mkdir project
  mkdir target
  mkdir -p src/main/scala
  cp $mypath/$mymysource.scala src/main/scala/
  cp $mypath/myproject.sbt .
 
  Make sure that myproject.sbt has the following in it:
 
  name := I NEED A NAME!
 
  version := I NEED A VERSION!
 
  scalaVersion := 2.10.3
 
  libraryDependencies += org.apache.spark % spark-core_2.10 %
  0.9.0-incubating
 
  If you will be using Hadoop/HDFS functionality you will need the below
  line also
 
  libraryDependencies += org.apache.hadoop % hadoop-client % 2.2.0
 
  The above assumes you are using Spark 0.9 and Scala 2.10.3. If you are
  using 0.8.1 - adjust appropriately.
 
  That's it. Now you can do
 
  sbt compile
  sbt run arguments if any
 
  You can also do
  sbt package to produce a jar file of your code which you can then add
 to
  the SparkContext at runtime.
 
  In a more complicated project you may need to have a bit more involved
  hierarchy like com.github.dianacarroll which will then translate to
  src/main/scala/com/github/dianacarroll/ where you can put your multiple
  .scala files which will then have to be a part of a package
  com.github.dianacarroll (you can just put 

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Yana Kadiyska
Diana, I think you are correct - I just downloaded it via
 wget http://mirror.symnds.com/software/Apache/incubator/spark/spark-0.9.0-incubating/spark-0.9.0-incubating-bin-cdh4.tgz
and indeed I see the same error that you see.

It looks like in previous versions sbt-launch used to just come down
in the package, but now they try to get it for you -- and that code
seems to have some assumptions about where it is being invoked from.
On Mon, Mar 24, 2014 at 5:47 PM, Diana Carroll dcarr...@cloudera.com wrote:
 Thanks for your help, everyone.  Several folks have explained that I can
 surely solve the problem by installing sbt.

 But I'm trying to get the instructions working as written on the Spark
 website.  The instructions not only don't have you install sbt
 separately...they actually specifically have you use the sbt that is
 distributed with Spark.

 If it is not possible to build your own Spark programs with
 Spark-distributed sbt, then that's a big hole in the Spark docs that I shall
 file.  And if the sbt that is included with Spark is MEANT to be able to
 compile your own Spark apps, then that's a product bug.

 But before I file the bug, I'm still hoping I'm missing something, and
 someone will point out that I'm missing a small step that will make the
 Spark distribution of sbt work!

 Diana



 On Mon, Mar 24, 2014 at 4:52 PM, Yana Kadiyska yana.kadiy...@gmail.com
 wrote:

 Diana, I just tried it on a clean Ubuntu machine, with Spark 0.8
 (since like other folks I had sbt preinstalled on my usual machine)

 I ran the command exactly as Ognen suggested and see
 Set current project to Simple Project (do you see this -- you should
 at least be seeing this)
 and then a bunch of Resolving ...

 messages. I did get an error there, saying it can't find
 javax.servlet.orbit. I googled the error and found this thread:


 http://mail-archives.apache.org/mod_mbox/spark-user/201309.mbox/%3ccajbo4nexyzqe6zgreqjtzzz5zrcoavfen+wmbyced6n1epf...@mail.gmail.com%3E

 adding the IvyXML fragment they suggested helped in my case (but
 again, the build pretty clearly complained).

 If you're still having no luck, I suggest installing sbt and setting
 SBT_HOME... http://www.scala-sbt.org/

 In either case though, it's not a Spark-specific issue...Hopefully
 some of all this helps.

 On Mon, Mar 24, 2014 at 4:30 PM, Diana Carroll dcarr...@cloudera.com
 wrote:
  Yeah, that's exactly what I did. Unfortunately it doesn't work:
 
  $SPARK_HOME/sbt/sbt package
  awk: cmd. line:1: fatal: cannot open file `./project/build.properties'
  for
  reading (No such file or directory)
  Attempting to fetch sbt
  /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
  directory
  /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
  directory
  Our attempt to download sbt locally to sbt/sbt-launch-.jar failed.
  Please
  install sbt manually from http://www.scala-sbt.org/
 
 
 
  On Mon, Mar 24, 2014 at 4:25 PM, Ognen Duzlevski
  og...@plainvanillagames.com wrote:
 
  You can use any sbt on your machine, including the one that comes with
  spark. For example, try:
 
  ~/path_to_spark/sbt/sbt compile
  ~/path_to_spark/sbt/sbt run arguments
 
  Or you can just add that to your PATH by:
 
  export $PATH=$PATH:~/path_to_spark/sbt
 
  To make it permanent, you can add it to your ~/.bashrc or
  ~/.bash_profile
  or ??? depending on the system you are using. If you are on Windows,
  sorry,
  I can't offer any help there ;)
 
  Ognen
 
 
  On 3/24/14, 3:16 PM, Diana Carroll wrote:
 
  Thanks Ongen.
 
  Unfortunately I'm not able to follow your instructions either.  In
  particular:
 
 
  sbt compile
  sbt run arguments if any
 
 
  This doesn't work for me because there's no program on my path called
  sbt.  The instructions in the Quick Start guide are specific that I
  should
  call $SPARK_HOME/sbt/sbt.  I don't have any other executable on my
  system
  called sbt.
 
  Did you download and install sbt separately?  In following the Quick
  Start
  guide, that was not stated as a requirement, and I'm trying to run
  through
  the guide word for word.
 
  Diana
 
 
  On Mon, Mar 24, 2014 at 4:12 PM, Ognen Duzlevski
  og...@plainvanillagames.com wrote:
 
  Diana,
 
  Anywhere on the filesystem you have read/write access (you need not be
  in
  your spark home directory):
 
  mkdir myproject
  cd myproject
  mkdir project
  mkdir target
  mkdir -p src/main/scala
  cp $mypath/$mymysource.scala src/main/scala/
  cp $mypath/myproject.sbt .
 
  Make sure that myproject.sbt has the following in it:
 
  name := I NEED A NAME!
 
  version := I NEED A VERSION!
 
  scalaVersion := 2.10.3
 
  libraryDependencies += org.apache.spark % spark-core_2.10 %
  0.9.0-incubating
 
  If you will be using Hadoop/HDFS functionality you will need the below
  line also
 
  libraryDependencies += org.apache.hadoop % hadoop-client % 2.2.0
 
  The above assumes you are using Spark 0.9 and Scala 2.10.3. If you are
  using 0.8.1 - adjust appropriately.
 
 

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Nan Zhu
I have to admit that I never read the document carefully, and I never found that the Spark
document suggests you use the Spark-distributed sbt……

Best,

--  
Nan Zhu



On Monday, March 24, 2014 at 5:47 PM, Diana Carroll wrote:

 Thanks for your help, everyone.  Several folks have explained that I can 
 surely solve the problem by installing sbt.
  
 But I'm trying to get the instructions working as written on the Spark 
 website.  The instructions not only don't have you install sbt 
 separately...they actually specifically have you use the sbt that is 
 distributed with Spark.  
  
 If it is not possible to build your own Spark programs with Spark-distributed 
 sbt, then that's a big hole in the Spark docs that I shall file.  And if the 
 sbt that is included with Spark is MEANT to be able to compile your own Spark 
 apps, then that's a product bug.  
  
 But before I file the bug, I'm still hoping I'm missing something, and 
 someone will point out that I'm missing a small step that will make the Spark 
 distribution of sbt work!
  
 Diana
  
  
  
 On Mon, Mar 24, 2014 at 4:52 PM, Yana Kadiyska yana.kadiy...@gmail.com 
 (mailto:yana.kadiy...@gmail.com) wrote:
  Diana, I just tried it on a clean Ubuntu machine, with Spark 0.8
  (since like other folks I had sbt preinstalled on my usual machine)
   
  I ran the command exactly as Ognen suggested and see
  Set current project to Simple Project (do you see this -- you should
  at least be seeing this)
  and then a bunch of Resolving ...
   
  messages. I did get an error there, saying it can't find
  javax.servlet.orbit. I googled the error and found this thread:
   
  http://mail-archives.apache.org/mod_mbox/spark-user/201309.mbox/%3ccajbo4nexyzqe6zgreqjtzzz5zrcoavfen+wmbyced6n1epf...@mail.gmail.com%3E
   
  adding the IvyXML fragment they suggested helped in my case (but
  again, the build pretty clearly complained).
   
  If you're still having no luck, I suggest installing sbt and setting
  SBT_HOME... http://www.scala-sbt.org/
   
  In either case though, it's not a Spark-specific issue...Hopefully
  some of all this helps.
   
  On Mon, Mar 24, 2014 at 4:30 PM, Diana Carroll dcarr...@cloudera.com 
  (mailto:dcarr...@cloudera.com) wrote:
   Yeah, that's exactly what I did. Unfortunately it doesn't work:
  
   $SPARK_HOME/sbt/sbt package
   awk: cmd. line:1: fatal: cannot open file `./project/build.properties' for
   reading (No such file or directory)
   Attempting to fetch sbt
   /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
   directory
   /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
   directory
   Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please
   install sbt manually from http://www.scala-sbt.org/
  
  
  
   On Mon, Mar 24, 2014 at 4:25 PM, Ognen Duzlevski
   og...@plainvanillagames.com (mailto:og...@plainvanillagames.com) wrote:
  
   You can use any sbt on your machine, including the one that comes with
   spark. For example, try:
  
   ~/path_to_spark/sbt/sbt compile
   ~/path_to_spark/sbt/sbt run arguments
  
   Or you can just add that to your PATH by:
  
   export $PATH=$PATH:~/path_to_spark/sbt
  
   To make it permanent, you can add it to your ~/.bashrc or ~/.bash_profile
   or ??? depending on the system you are using. If you are on Windows, 
   sorry,
   I can't offer any help there ;)
  
   Ognen
  
  
   On 3/24/14, 3:16 PM, Diana Carroll wrote:
  
   Thanks Ongen.
  
   Unfortunately I'm not able to follow your instructions either.  In
   particular:
  
  
   sbt compile
   sbt run arguments if any
  
  
   This doesn't work for me because there's no program on my path called
   sbt.  The instructions in the Quick Start guide are specific that I 
   should
   call $SPARK_HOME/sbt/sbt.  I don't have any other executable on my 
   system
   called sbt.
  
   Did you download and install sbt separately?  In following the Quick 
   Start
   guide, that was not stated as a requirement, and I'm trying to run 
   through
   the guide word for word.
  
   Diana
  
  
   On Mon, Mar 24, 2014 at 4:12 PM, Ognen Duzlevski
   og...@plainvanillagames.com (mailto:og...@plainvanillagames.com) wrote:
  
   Diana,
  
   Anywhere on the filesystem you have read/write access (you need not be 
   in
   your spark home directory):
  
   mkdir myproject
   cd myproject
   mkdir project
   mkdir target
   mkdir -p src/main/scala
   cp $mypath/$mymysource.scala src/main/scala/
   cp $mypath/myproject.sbt .
  
   Make sure that myproject.sbt has the following in it:
  
   name := I NEED A NAME!
  
   version := I NEED A VERSION!
  
   scalaVersion := 2.10.3
  
   libraryDependencies += org.apache.spark % spark-core_2.10 %
   0.9.0-incubating
  
   If you will be using Hadoop/HDFS functionality you will need the below
   line also
  
   libraryDependencies += org.apache.hadoop % hadoop-client % 2.2.0
  
   The above assumes you are using Spark 0.9 and Scala 2.10.3. If you are
   using 0.8.1 

Re: quick start guide: building a standalone scala program

2014-03-24 Thread Diana Carroll
It is suggested implicitly by giving you the command ./sbt/sbt: the
separately installed sbt isn't in a folder called sbt, whereas Spark's
version is.  And more relevantly, just a few paragraphs earlier in the
tutorial you execute the command sbt/sbt assembly, which definitely refers
to the Spark install.

On Monday, March 24, 2014, Nan Zhu zhunanmcg...@gmail.com wrote:

 I found that I never read the document carefully and I never find that
 Spark document is suggesting you to use Spark-distributed sbt..

 Best,

 --
 Nan Zhu


 On Monday, March 24, 2014 at 5:47 PM, Diana Carroll wrote:

 Thanks for your help, everyone.  Several folks have explained that I can
 surely solve the problem by installing sbt.

 But I'm trying to get the instructions working *as written on the Spark
 website*.  The instructions not only don't have you install sbt
 separately...they actually specifically have you use the sbt that is
 distributed with Spark.

 If it is not possible to build your own Spark programs with
 Spark-distributed sbt, then that's a big hole in the Spark docs that I
 shall file.  And if the sbt that is included with Spark is MEANT to be able
 to compile your own Spark apps, then that's a product bug.

 But before I file the bug, I'm still hoping I'm missing something, and
 someone will point out that I'm missing a small step that will make the
 Spark distribution of sbt work!

 Diana



 On Mon, Mar 24, 2014 at 4:52 PM, Yana Kadiyska yana.kadiy...@gmail.com wrote:

 Diana, I just tried it on a clean Ubuntu machine, with Spark 0.8
 (since like other folks I had sbt preinstalled on my usual machine)

 I ran the command exactly as Ognen suggested and see
 Set current project to Simple Project (do you see this -- you should
 at least be seeing this)
 and then a bunch of Resolving ...

 messages. I did get an error there, saying it can't find
 javax.servlet.orbit. I googled the error and found this thread:


 http://mail-archives.apache.org/mod_mbox/spark-user/201309.mbox/%3ccajbo4nexyzqe6zgreqjtzzz5zrcoavfen+wmbyced6n1epf...@mail.gmail.com%3E

 adding the IvyXML fragment they suggested helped in my case (but
 again, the build pretty clearly complained).

 If you're still having no luck, I suggest installing sbt and setting
 SBT_HOME... http://www.scala-sbt.org/

 In either case though, it's not a Spark-specific issue...Hopefully
 some of all this helps.

 On Mon, Mar 24, 2014 at 4:30 PM, Diana Carroll dcarr...@cloudera.com
 wrote:
  Yeah, that's exactly what I did. Unfortunately it doesn't work:
 
  $SPARK_HOME/sbt/sbt package
  awk: cmd. line:1: fatal: cannot open file `./project/build.properties'
 for
  reading (No such file or directory)
  Attempting to fetch sbt
  /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
  directory
  /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
  directory
  Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please
  install sbt manually from http://www.scala-sbt.org/
 
 
 
  On Mon, Mar 24, 2014 at 4:25 PM, Ognen Duzlevski
  og...@plainvanillagames.com wrote:
 
  You can use any sbt on your machine, including the one that comes with
  spark. For example, try:
 
  ~/path_to_spark/sbt/sbt compile
  ~/path_to_spark/sbt/sbt run arguments
 
  Or you can just add that to your PATH by:
 
  export $PATH=$PATH:~/path_to_spark/sbt
 
  To make it permanent, you can add it to your ~/.bashrc or
 ~/.bash_profile
  or ??? depending on the system you are using. If you are on Windows,
 sorry,
  I can't offer any help there ;)
 
  Ognen
 
 
  On 3/24/14, 3:16 PM, Diana Carroll wrote:
 
  Thanks Ongen.
 
  Unfortunately I'm not able to follow your instructions either.  In
  particular:
 
 
  sbt compile
  sbt run arguments if any
 
 
  This doesn't work for me because there's no program on my path called
  sbt.  The instructions in the Quick Start guide are specific that I
 sho




Re: quick start guide: building a standalone scala program

2014-03-24 Thread Nan Zhu
Yes, actually even for Spark I mostly use the sbt I installed…..so I always missed this issue….

If you can reproduce the problem with the Spark-distributed sbt…I suggest
proposing a PR to fix the document before 0.9.1 is officially released.

Best,  

--  
Nan Zhu



On Monday, March 24, 2014 at 8:34 PM, Diana Carroll wrote:

 It is suggested implicitly in giving you the command ./sbt/sbt. The 
 separately installed sbt isn't in a folder called sbt, whereas Spark's 
 version is.  And more relevantly, just a few paragraphs earlier in the 
 tutorial you execute the command sbt/sbt assembly which definitely refers 
 to the spark install.  
  
 On Monday, March 24, 2014, Nan Zhu zhunanmcg...@gmail.com 
 (mailto:zhunanmcg...@gmail.com) wrote:
  I found that I never read the document carefully and I never find that 
  Spark document is suggesting you to use Spark-distributed sbt……  
   
  Best,
   
  --  
  Nan Zhu
   
   
   
  On Monday, March 24, 2014 at 5:47 PM, Diana Carroll wrote:
   
   Thanks for your help, everyone.  Several folks have explained that I can 
   surely solve the problem by installing sbt.

   But I'm trying to get the instructions working as written on the Spark 
   website.  The instructions not only don't have you install sbt 
   separately...they actually specifically have you use the sbt that is 
   distributed with Spark.  

   If it is not possible to build your own Spark programs with 
   Spark-distributed sbt, then that's a big hole in the Spark docs that I 
   shall file.  And if the sbt that is included with Spark is MEANT to be 
   able to compile your own Spark apps, then that's a product bug.  

   But before I file the bug, I'm still hoping I'm missing something, and 
   someone will point out that I'm missing a small step that will make the 
   Spark distribution of sbt work!

   Diana



   On Mon, Mar 24, 2014 at 4:52 PM, Yana Kadiyska yana.kadiy...@gmail.com 
   wrote:
Diana, I just tried it on a clean Ubuntu machine, with Spark 0.8
(since like other folks I had sbt preinstalled on my usual machine)
 
I ran the command exactly as Ognen suggested and see
Set current project to Simple Project (do you see this -- you should
at least be seeing this)
and then a bunch of Resolving ...
 
messages. I did get an error there, saying it can't find
javax.servlet.orbit. I googled the error and found this thread:
 
http://mail-archives.apache.org/mod_mbox/spark-user/201309.mbox/%3ccajbo4nexyzqe6zgreqjtzzz5zrcoavfen+wmbyced6n1epf...@mail.gmail.com%3E
 
adding the IvyXML fragment they suggested helped in my case (but
again, the build pretty clearly complained).
 
If you're still having no luck, I suggest installing sbt and setting
SBT_HOME... http://www.scala-sbt.org/
 
In either case though, it's not a Spark-specific issue...Hopefully
some of all this helps.
 
On Mon, Mar 24, 2014 at 4:30 PM, Diana Carroll dcarr...@cloudera.com 
wrote:
 Yeah, that's exactly what I did. Unfortunately it doesn't work:

 $SPARK_HOME/sbt/sbt package
 awk: cmd. line:1: fatal: cannot open file 
 `./project/build.properties' for
 reading (No such file or directory)
 Attempting to fetch sbt
 /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
 directory
 /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
 directory
 Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. 
 Please
 install sbt manually from http://www.scala-sbt.org/



 On Mon, Mar 24, 2014 at 4:25 PM, Ognen Duzlevski
 og...@plainvanillagames.com wrote:

 You can use any sbt on your machine, including the one that comes 
 with
 spark. For example, try:

 ~/path_to_spark/sbt/sbt compile
 ~/path_to_spark/sbt/sbt run arguments

 Or you can just add that to your PATH by:

 export $PATH=$PATH:~/path_to_spark/sbt

 To make it permanent, you can add it to your ~/.bashrc or 
 ~/.bash_profile
 or ??? depending on the system you are using. If you are on Windows, 
 sorry,
 I can't offer any help there ;)

 Ognen


 On 3/24/14, 3:16 PM, Diana Carroll wrote:

 Thanks Ongen.

 Unfortunately I'm not able to follow your instructions either.  In
 particular:


 sbt compile
 sbt run arguments if any


 This doesn't work for me because there's no program on my path called
 sbt.  The instructions in the Quick Start guide are specific that 
 I sho