Hi, 

I have started an EC2 cluster with Spark by running the spark-ec2 script.
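
For context, this is roughly how I launched it (the key pair, identity file,
slave count, and cluster name below are placeholders, not my actual values):

  ./ec2/spark-ec2 -k <keypair> -i <key-file> -s <num-slaves> launch <cluster-name>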

I am a little confused: I cannot find the sbt/ directory under /spark.

I have checked the Spark version; it is 1.0.0 (the default). When I was working
with 0.9.x, sbt/ was there.

Has the script changed in 1.0.x? I cannot find any changelog entry about this, or
maybe I am missing something.

Of course I can download sbt myself and make things work; I just want to
understand what changed.
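
In case it helps to be concrete, this is the sort of workaround I have in mind
(the launcher URL is a placeholder, and I am assuming the 0.9.x-style sbt
assembly build still applies):

  # fetch a standalone sbt launcher and build, as sbt/sbt assembly did in 0.9.x
  wget -O sbt-launch.jar <url-to-sbt-launch-jar>
  java -jar sbt-launch.jar assembly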

Thank you.

Here is the file listing of spark/:

root@ip-10-81-154-223:~# ls -l spark
total 384
drwxrwxr-x 10 1000 1000   4096 Jul 28 14:58 .
drwxr-xr-x 20 root root   4096 Jul 28 14:58 ..
drwxrwxr-x  2 1000 1000   4096 Jul 28 13:34 bin
-rw-rw-r--  1 1000 1000 281471 May 26 07:02 CHANGES.txt
drwxrwxr-x  2 1000 1000   4096 Jul 28 08:22 conf
drwxrwxr-x  4 1000 1000   4096 May 26 07:02 ec2
drwxrwxr-x  3 1000 1000   4096 May 26 07:02 examples
drwxrwxr-x  2 1000 1000   4096 May 26 07:02 lib
-rw-rw-r--  1 1000 1000  29983 May 26 07:02 LICENSE
drwxr-xr-x  2 root root   4096 Jul 28 14:42 logs
-rw-rw-r--  1 1000 1000  22559 May 26 07:02 NOTICE
drwxrwxr-x  6 1000 1000   4096 May 26 07:02 python
-rw-rw-r--  1 1000 1000   4221 May 26 07:02 README.md
-rw-rw-r--  1 1000 1000     35 May 26 07:02 RELEASE
drwxrwxr-x  2 1000 1000   4096 May 26 07:02 sbin
