Hi,
        I am new to Spark and Scala. As part of one of my projects, I am 
trying to build and locally publish spark-0.8.0-incubating on an Amazon EC2 
cluster.
        After setting up all the Java classpaths and options, I run one of:
        **      sbt/sbt compile
        **      sbt/sbt assembly
        **      sbt/sbt publish-local
        Each command runs for some time (approximately 10 minutes), after 
which the java process simply gets killed. No error message is printed. Here 
is a small snapshot of the messages:

        [info] Updating {file:/home/ec2-user/spark-0.8.0-incubating/}bagel...
        [info] Resolving cglib#cglib-nodep;2.2.2 ...
        [info] Done updating.
        [info] Compiling 258 Scala sources and 16 Java sources to /home/ec2-user/spark-0.8.0-incubating/core/target/scala-2.9.3/classes...
        sbt/sbt: line 30: 21454 Killed    java -Xmx1200m -XX:MaxPermSize=350m -XX:ReservedCodeCacheSize=256m $EXTRA_ARGS $SBT_OPTS -jar "$SPARK_HOME"/sbt/sbt-launch-*.jar "$@"

        When I run jstat, I get the following output:
        Timestamp    S0      S1      E       O      P      YGC  YGCT    FGC  FGCT    GCT     LGCC                       GCC
        257.3        100.00  0.00    50.66   82.97  99.59  156  5.163   10   12.076  17.239  unknown GCCause            No GC
        365.2        0.00    100.00  53.13   88.96  99.94  157  7.563   10   12.076  19.639  unknown GCCause            No GC
        386.6        0.00    0.00    1.97    60.00  99.51  157  7.563   11   25.281  32.844  Permanent Generation Full  No GC
        407.9        0.00    0.00    29.53   60.00  99.97  157  7.563   11   25.281  32.844  Permanent Generation Full  No GC
        578.8        64.82   0.00    41.90   77.75  99.68  162  10.896  11   25.281  36.178  unknown GCCause            No GC
        600.1        64.82   0.00    91.90   77.75  99.72  162  10.896  11   25.281  36.178  unknown GCCause            No GC
        664.2        77.92   70.32   100.00  99.94  99.71  168  12.451  11   25.281  37.732  unknown GCCause            Allocation Failure
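
        For reference, the columns above match jstat's -gccause mode with a 
timestamp column, so the invocation was along these lines (the pid here is a 
placeholder for the sbt-launched java process):

        jstat -gccause -t <java-pid>

        The P column (PermGen occupancy) stays near 99% throughout, and LGCC 
twice reports "Permanent Generation Full", which is what prompted the change 
below.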

        I increased the memory limits to -XX:MaxPermSize=720m and 
-XX:ReservedCodeCacheSize=512m, but the problem still persists.
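
        For what it's worth, I applied the new limits along these lines (this 
assumes $SBT_OPTS, which the launch line above passes to java, and that a 
later duplicate -XX flag overrides the earlier hard-coded one):

        # picked up via $SBT_OPTS on the java launch line shown above
        export SBT_OPTS="-XX:MaxPermSize=720m -XX:ReservedCodeCacheSize=512m"
        sbt/sbt assembly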
        I am not able to figure out why the command is getting killed. Please 
let me know if there are other checks I should run. I have read through many 
links via Google and on the Spark site as well, but was not able to get any 
insight into this problem.
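
        The only other check I could think of is to look for kernel 
OOM-killer messages, since a bare "Killed" with no JVM error usually means an 
external SIGKILL (a sketch, assuming standard Linux kernel logging on the 
instance):

        # look for evidence that the kernel OOM killer took down the JVM
        dmesg | grep -iE 'out of memory|killed process'

        Is that the right direction, or is there something sbt- or 
Spark-specific I should check?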

        I am using the following:
        Java version: 6
        JVM: 1.6.0-openjdk.x86_64
        Scala version: 2.9.3

        Any help would be deeply appreciated.

Thanks,
