I have found that I am unable to build/test Spark with sbt and Java 6, but
using Java 7 works (and it compiles with Java target version 1.6, so the
binaries are usable from Java 6).
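
In case it helps, a minimal sketch of pointing the build at a JDK 7 while
keeping the 1.6 bytecode target (the JDK path below is an assumption; adjust
it to your install):

$ export JAVA_HOME=/usr/lib/jvm/java-7-oracle   # path is an assumption
$ export PATH=$JAVA_HOME/bin:$PATH
$ java -version        # should report 1.7.x
$ sbt/sbt assembly     # the build itself still emits 1.6 bytecode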


On Sat, Mar 22, 2014 at 3:11 PM, Bharath Bhushan <manku.ti...@outlook.com> wrote:

> Thanks for the reply. It turns out that my ubuntu-in-vagrant had only 512MB
> of RAM. Increasing it to 1024MB allowed the assembly to finish successfully.
> Peak usage was around 780MB.
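>
> For anyone reproducing this, a quick sketch for confirming the new RAM size
> and watching the build's peak memory from inside the VM (the watch/ps
> invocation is my own addition, not part of the original steps):
>
> $ free -m                           # total should now show ~1024MB
> $ sbt/sbt assembly &                # start the build in the background
> $ watch -n 5 'ps -o rss= -C java'   # resident size in KB; peaked near 780MB here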
>
> ------------------------------
> To: user@spark.apache.org
> From: vboylin1...@gmail.com
> Subject: Re: unable to build spark - sbt/sbt: line 50: killed
> Date: Sat, 22 Mar 2014 20:03:28 +0800
>
>
> Building Spark needs a lot of memory; I think you should make -Xmx larger,
> 2g for example.
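>
> Since sbt/sbt runs the JVM from line 50 with a hardcoded heap, one hedged way
> to try a 2g heap is to invoke the launcher by hand (the launch-jar name below
> is an assumption; use whatever ${JAR} resolves to in your sbt/sbt script):
>
> $ java -Xmx2g -XX:MaxPermSize=1000m -XX:ReservedCodeCacheSize=256m \
>     -jar sbt/sbt-launch-*.jar assembly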
>  ------------------------------
> From: Bharath Bhushan <manku.ti...@outlook.com>
> Sent: 2014/3/22 12:50
> To: user@spark.apache.org
> Subject: unable to build spark - sbt/sbt: line 50: killed
>
> I am getting the following error when trying to build Spark. I tried
> various sizes for -Xmx and other memory-related arguments on the java
> command line, but the assembly command still fails.
>
> $ sbt/sbt assembly
> ...
> [info] Compiling 298 Scala sources and 17 Java sources to
> /vagrant/spark-0.9.0-incubating-bin-hadoop2/core/target/scala-2.10/classes...
> sbt/sbt: line 50: 10202 Killed                  java -Xmx1900m
> -XX:MaxPermSize=1000m -XX:ReservedCodeCacheSize=256m -jar ${JAR} "$@"
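>
> A note on reading that failure: "Killed" with no JVM OutOfMemoryError usually
> means the Linux OOM killer terminated the process, so tuning -Xmx alone may
> not help if the VM itself is short on RAM. A sketch of how to confirm (the
> grep pattern is an assumption about the kernel's log wording):
>
> $ dmesg | grep -i 'killed process'   # look for the java pid (10202 here)
> $ free -m                            # total RAM available to the VM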
>
> Versions of software:
> Spark: 0.9.0 (hadoop2 binary)
> Scala: 2.10.3
> Ubuntu: Ubuntu 12.04.4 LTS - Linux vagrant-ubuntu-precise-64
> 3.2.0-54-generic
> Java: 1.6.0_45 (Oracle Java 6)
>
> I can still use the binaries in bin/, but I was just trying to check that
> "sbt/sbt assembly" works fine.
>
> -- Thanks
>
