Re: 1.5 Build Errors

2015-10-06 Thread Benjamin Zaitlen
Hi All,

Sean patiently worked with me to solve this issue.  The problem was
entirely my fault: the MAVEN_OPTS env variable was set in my settings and
was overriding everything.

--Ben
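For anyone hitting the same thing, here is a quick sketch of how to check
whether a stray MAVEN_OPTS is exported into child processes and to clear it
before rebuilding (the first export just simulates the situation from this
thread):

```shell
# Simulate the situation from the thread: MAVEN_OPTS exported in the shell.
export MAVEN_OPTS="-Xmx4g -XX:MaxPermSize=1024M"

# printenv shows only *exported* variables, so a non-empty result here
# confirms child processes (like build/mvn) will inherit the value.
printenv MAVEN_OPTS

# Clear it so the build scripts fall back to their own defaults.
unset MAVEN_OPTS
printenv MAVEN_OPTS || echo "MAVEN_OPTS is no longer exported"
```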

On Tue, Sep 8, 2015 at 1:37 PM, Benjamin Zaitlen  wrote:

> Yes, just reran with the following
>
> (spark_build)root@ip-10-45-130-206:~/spark# export MAVEN_OPTS="-Xmx4096mb
>> -XX:MaxPermSize=1024M -XX:ReservedCodeCacheSize=1024m"
>> (spark_build)root@ip-10-45-130-206:~/spark# build/mvn -Pyarn
>> -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
>
>
> and grepping for java
>
>
> root   641  9.9  0.3 4411732 49040 pts/4   Sl+  17:35   0:01
>> /usr/lib/jvm/java-7-openjdk-amd64/bin/java -server -Xmx2g
>> -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m
>> -Dzinc.home=/root/spark/build/zinc-0.3.5.3 -classpath
>> /root/spark/build/zinc-0.3.5.3/lib/compiler-interface-sources.jar:/root/spark/build/zinc-0.3.5.3/lib/incremental-compiler.jar:/root/spark/build/zinc-0.3.5.3/lib/nailgun-server.jar:/root/spark/build/zinc-0.3.5.3/lib/sbt-interface.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-compiler.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-library.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-reflect.jar:/root/spark/build/zinc-0.3.5.3/lib/zinc.jar
>> com.typesafe.zinc.Nailgun 3030 0
>> root   687  226  2.0 1803664 312876 pts/4  Sl+  17:36   0:22
>> /usr/lib/jvm/java-7-openjdk-amd64/bin/java -Xms256m -Xmx512m -classpath
>> /opt/anaconda/envs/spark_build/share/apache-maven-3.3.3/boot/plexus-classworlds-2.5.2.jar
>> -Dclassworlds.conf=/opt/anaconda/envs/spark_build/share/apache-maven-3.3.3/bin/m2.conf
>> -Dmaven.home=/opt/anaconda/envs/spark_build/share/apache-maven-3.3.3
>> -Dmaven.multiModuleProjectDirectory=/root/spark
>> org.codehaus.plexus.classworlds.launcher.Launcher -DzincPort=3030 -Pyarn
>> -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
>
>
> On Tue, Sep 8, 2015 at 1:14 PM, Sean Owen  wrote:
>
>> MAVEN_OPTS shouldn't affect zinc as it's an unrelated application. You
>> can run "zinc -J-Xmx4g..." in general, but in the provided script,
>> ZINC_OPTS seems to be the equivalent, yes. It kind of looks like your
>> mvn process isn't getting any special memory args there. Is MAVEN_OPTS
>> really exported?
>>
>> FWIW I use my own local mvn and zinc and it works fine.
>>
>> On Tue, Sep 8, 2015 at 6:05 PM, Benjamin Zaitlen 
>> wrote:
>> > I'm running zinc while compiling.  It seems that MAVEN_OPTS doesn't
>> really
>> > change much?  Or perhaps I'm misunderstanding something -- grepping for
>> java
>> > i see
>> >
>> >> root 24355  102  8.8 4687376 1350724 pts/4 Sl   16:51  11:08
>> >> /usr/lib/jvm/java-7-openjdk-amd64/bin/java -server -Xmx2g
>> >> -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m
>> >> -Dzinc.home=/root/spark/build/zinc-0.3.5.3 -classpath
>> >>
>> /root/spark/build/zinc-0.3.5.3/lib/compiler-interface-sources.jar:/root/spark/build/zinc-0.3.5.3/lib/incremental-compiler.jar:/root/spark/build/zinc-0.3.5.3/lib/nailgun-server.jar:/root/spark/build/zinc-0.3.5.3/lib/sbt-interface.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-compiler.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-library.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-reflect.jar:/root/spark/build/zinc-0.3.5.3/lib/zinc.jar
>> >> com.typesafe.zinc.Nailgun 3030 0
>> >> root 25151 22.0  3.2 2269092 495276 pts/4  Sl+  16:53   1:56
>> >> /usr/lib/jvm/java-7-openjdk-amd64/bin/java -Xms256m -Xmx512m -classpath
>> >>
>> /opt/anaconda/envs/spark_build/share/apache-maven-3.3.3/boot/plexus-classworlds-2.5.2.jar
>> >>
>> -Dclassworlds.conf=/opt/anaconda/envs/spark_build/share/apache-maven-3.3.3/bin/m2.conf
>> >> -Dmaven.home=/opt/anaconda/envs/spark_build/share/apache-maven-3.3.3
>> >> -Dmaven.multiModuleProjectDirectory=/root/spark
>> >> org.codehaus.plexus.classworlds.launcher.Launcher -DzincPort=3030 clean
>> >> package -DskipTests -Pyarn -Phive -Phive-thriftserver -Phadoop-2.4
>> >> -Dhadoop.version=2.4.0
>> >
>> >
>> > So the heap size is still 2g even with MAVEN_OPTS set with 4g.  I
>> noticed
>> > that within build/mvn _COMPILE_JVM_OPTS is set to 2g and this is what
>> > ZINC_OPTS is set to.
>> >
>> > --Ben
>> >
>> >
>> > On Tue, Sep 8, 2015 at 11:06 AM, Ted Yu  wrote:
>> >>
>> >> Do you run Zinc while compiling ?
>> >>
>> >> Cheers
>> >>
>> >> On Tue, Sep 8, 2015 at 7:56 AM, Benjamin Zaitlen 
>> >> wrote:
>> >>>
>> >>> I'm still getting errors with 3g.  I've increased to 4g and I'll report
>> >>> back
>> >>>
>> >>> To be clear:
>> >>>
>> >>> export MAVEN_OPTS="-Xmx4g -XX:MaxPermSize=1024M
>> >>> -XX:ReservedCodeCacheSize=1024m"
>> >>>
>>  [ERROR] GC overhead limit exceeded -> [Help 1]
>>  [ERROR]
>>  [ERROR] To see the full stack trace of the errors, re-run Maven with
>> the
>>  -e switch.
>>  [ERROR] Re-run Maven using the -X switch to enable full debug
>> logging.
>>  [ERROR]
>>  [ERROR] For more 

Re: 1.5 Build Errors

2015-09-08 Thread Benjamin Zaitlen
Ah, right.  Should've caught that.

The docs seem to recommend 2gb.  Should that be increased as well?

--Ben

On Tue, Sep 8, 2015 at 9:33 AM, Sean Owen  wrote:

> It shows you there that Maven is out of memory. Give it more heap. I use
> 3gb.
>
> On Tue, Sep 8, 2015 at 1:53 PM, Benjamin Zaitlen 
> wrote:
> > Hi All,
> >
> > I'm trying to build a distribution off of the latest in master and I keep
> > getting errors on MQTT and the build fails.   I'm running the build on a
> > m1.large which has 7.5 GB of RAM and no other major processes are
> running.
> >
> >> MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
> >> ./make-distribution.sh  --name continuum-custom-spark-1.5 --tgz -Pyarn
> >> -Phive -Phive-thriftserver -Phadoop-2.4 -Dhadoop.version=2.4.0
> >
> >
> >
> >> INFO] Spark Project GraphX ... SUCCESS [
> >> 33.345 s]
> >> [INFO] Spark Project Streaming  SUCCESS
> [01:08
> >> min]
> >> [INFO] Spark Project Catalyst . SUCCESS
> [01:39
> >> min]
> >> [INFO] Spark Project SQL .. SUCCESS
> [02:06
> >> min]
> >> [INFO] Spark Project ML Library ... SUCCESS
> [02:16
> >> min]
> >> [INFO] Spark Project Tools  SUCCESS [
> >> 4.087 s]
> >> [INFO] Spark Project Hive . SUCCESS
> [01:28
> >> min]
> >> [INFO] Spark Project REPL . SUCCESS [
> >> 16.291 s]
> >> [INFO] Spark Project YARN Shuffle Service . SUCCESS [
> >> 13.671 s]
> >> [INFO] Spark Project YARN . SUCCESS [
> >> 20.554 s]
> >> [INFO] Spark Project Hive Thrift Server ... SUCCESS [
> >> 14.332 s]
> >> [INFO] Spark Project Assembly . SUCCESS
> [03:33
> >> min]
> >> [INFO] Spark Project External Twitter . SUCCESS [
> >> 14.208 s]
> >> [INFO] Spark Project External Flume Sink .. SUCCESS [
> >> 11.535 s]
> >> [INFO] Spark Project External Flume ... SUCCESS [
> >> 19.010 s]
> >> [INFO] Spark Project External Flume Assembly .. SUCCESS [
> >> 5.210 s]
> >> [INFO] Spark Project External MQTT  FAILURE
> [01:10
> >> min]
> >> [INFO] Spark Project External MQTT Assembly ... SKIPPED
> >> [INFO] Spark Project External ZeroMQ .. SKIPPED
> >> [INFO] Spark Project External Kafka ... SKIPPED
> >> [INFO] Spark Project Examples . SKIPPED
> >> [INFO] Spark Project External Kafka Assembly .. SKIPPED
> >> [INFO]
> >> 
> >> [INFO] BUILD FAILURE
> >> [INFO]
> >> 
> >> [INFO] Total time: 22:55 min
> >> [INFO] Finished at: 2015-09-07T22:42:57+00:00
> >> [INFO] Final Memory: 240M/455M
> >> [INFO]
> >> 
> >> [ERROR] GC overhead limit exceeded -> [Help 1]
> >> [ERROR]
> >> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> >> -e switch.
> >> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> >> [ERROR]
> >> [ERROR] For more information about the errors and possible solutions,
> >> please read the following articles:
> >> [ERROR] [Help 1]
> >> http://cwiki.apache.org/confluence/display/MAVEN/OutOfMemoryError
> >> + return 1
> >> + exit 1
> >
> >
> > Any thoughts would be extremely helpful.
> >
> > --Ben
>


Re: 1.5 Build Errors

2015-09-08 Thread Sean Owen
It shows you there that Maven is out of memory. Give it more heap. I use 3gb.

On Tue, Sep 8, 2015 at 1:53 PM, Benjamin Zaitlen  wrote:
> Hi All,
>
> I'm trying to build a distribution off of the latest in master and I keep
> getting errors on MQTT and the build fails.   I'm running the build on a
> m1.large which has 7.5 GB of RAM and no other major processes are running.
>
>> MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
>> ./make-distribution.sh  --name continuum-custom-spark-1.5 --tgz -Pyarn
>> -Phive -Phive-thriftserver -Phadoop-2.4 -Dhadoop.version=2.4.0
>
>
>
>> INFO] Spark Project GraphX ... SUCCESS [
>> 33.345 s]
>> [INFO] Spark Project Streaming  SUCCESS [01:08
>> min]
>> [INFO] Spark Project Catalyst . SUCCESS [01:39
>> min]
>> [INFO] Spark Project SQL .. SUCCESS [02:06
>> min]
>> [INFO] Spark Project ML Library ... SUCCESS [02:16
>> min]
>> [INFO] Spark Project Tools  SUCCESS [
>> 4.087 s]
>> [INFO] Spark Project Hive . SUCCESS [01:28
>> min]
>> [INFO] Spark Project REPL . SUCCESS [
>> 16.291 s]
>> [INFO] Spark Project YARN Shuffle Service . SUCCESS [
>> 13.671 s]
>> [INFO] Spark Project YARN . SUCCESS [
>> 20.554 s]
>> [INFO] Spark Project Hive Thrift Server ... SUCCESS [
>> 14.332 s]
>> [INFO] Spark Project Assembly . SUCCESS [03:33
>> min]
>> [INFO] Spark Project External Twitter . SUCCESS [
>> 14.208 s]
>> [INFO] Spark Project External Flume Sink .. SUCCESS [
>> 11.535 s]
>> [INFO] Spark Project External Flume ... SUCCESS [
>> 19.010 s]
>> [INFO] Spark Project External Flume Assembly .. SUCCESS [
>> 5.210 s]
>> [INFO] Spark Project External MQTT  FAILURE [01:10
>> min]
>> [INFO] Spark Project External MQTT Assembly ... SKIPPED
>> [INFO] Spark Project External ZeroMQ .. SKIPPED
>> [INFO] Spark Project External Kafka ... SKIPPED
>> [INFO] Spark Project Examples . SKIPPED
>> [INFO] Spark Project External Kafka Assembly .. SKIPPED
>> [INFO]
>> 
>> [INFO] BUILD FAILURE
>> [INFO]
>> 
>> [INFO] Total time: 22:55 min
>> [INFO] Finished at: 2015-09-07T22:42:57+00:00
>> [INFO] Final Memory: 240M/455M
>> [INFO]
>> 
>> [ERROR] GC overhead limit exceeded -> [Help 1]
>> [ERROR]
>> [ERROR] To see the full stack trace of the errors, re-run Maven with the
>> -e switch.
>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>> [ERROR]
>> [ERROR] For more information about the errors and possible solutions,
>> please read the following articles:
>> [ERROR] [Help 1]
>> http://cwiki.apache.org/confluence/display/MAVEN/OutOfMemoryError
>> + return 1
>> + exit 1
>
>
> Any thoughts would be extremely helpful.
>
> --Ben

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: 1.5 Build Errors

2015-09-08 Thread Benjamin Zaitlen
I'm still getting errors with 3g.  I've increased to 4g and I'll report back.

To be clear:

export MAVEN_OPTS="-Xmx4g -XX:MaxPermSize=1024M
-XX:ReservedCodeCacheSize=1024m"

[ERROR] GC overhead limit exceeded -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
> [ERROR] [Help 1]
> http://cwiki.apache.org/confluence/display/MAVEN/OutOfMemoryError
> + return 1
> + exit 1


On Tue, Sep 8, 2015 at 10:03 AM, Sean Owen  wrote:

> It might need more memory in certain situations / running certain
> tests. If 3gb works for your relatively full build, yes you can open a
> PR to change any occurrences of lower recommendations to 3gb.
>
> On Tue, Sep 8, 2015 at 3:02 PM, Benjamin Zaitlen 
> wrote:
> > Ah, right.  Should've caught that.
> >
> > The docs seem to recommend 2gb.  Should that be increased as well?
> >
> > --Ben
> >
> > On Tue, Sep 8, 2015 at 9:33 AM, Sean Owen  wrote:
> >>
> >> It shows you there that Maven is out of memory. Give it more heap. I use
> >> 3gb.
> >>
> >> On Tue, Sep 8, 2015 at 1:53 PM, Benjamin Zaitlen 
> >> wrote:
> >> > Hi All,
> >> >
> >> > I'm trying to build a distribution off of the latest in master and I
> >> > keep
> >> > getting errors on MQTT and the build fails.   I'm running the build
> on a
> >> > m1.large which has 7.5 GB of RAM and no other major processes are
> >> > running.
> >> >
> >> >> MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M
> -XX:ReservedCodeCacheSize=512m"
> >> >> ./make-distribution.sh  --name continuum-custom-spark-1.5 --tgz
> -Pyarn
> >> >> -Phive -Phive-thriftserver -Phadoop-2.4 -Dhadoop.version=2.4.0
> >> >
> >> >
> >> >
> >> >> INFO] Spark Project GraphX ... SUCCESS [
> >> >> 33.345 s]
> >> >> [INFO] Spark Project Streaming  SUCCESS
> >> >> [01:08
> >> >> min]
> >> >> [INFO] Spark Project Catalyst . SUCCESS
> >> >> [01:39
> >> >> min]
> >> >> [INFO] Spark Project SQL .. SUCCESS
> >> >> [02:06
> >> >> min]
> >> >> [INFO] Spark Project ML Library ... SUCCESS
> >> >> [02:16
> >> >> min]
> >> >> [INFO] Spark Project Tools  SUCCESS [
> >> >> 4.087 s]
> >> >> [INFO] Spark Project Hive . SUCCESS
> >> >> [01:28
> >> >> min]
> >> >> [INFO] Spark Project REPL . SUCCESS [
> >> >> 16.291 s]
> >> >> [INFO] Spark Project YARN Shuffle Service . SUCCESS [
> >> >> 13.671 s]
> >> >> [INFO] Spark Project YARN . SUCCESS [
> >> >> 20.554 s]
> >> >> [INFO] Spark Project Hive Thrift Server ... SUCCESS [
> >> >> 14.332 s]
> >> >> [INFO] Spark Project Assembly . SUCCESS
> >> >> [03:33
> >> >> min]
> >> >> [INFO] Spark Project External Twitter . SUCCESS [
> >> >> 14.208 s]
> >> >> [INFO] Spark Project External Flume Sink .. SUCCESS [
> >> >> 11.535 s]
> >> >> [INFO] Spark Project External Flume ... SUCCESS [
> >> >> 19.010 s]
> >> >> [INFO] Spark Project External Flume Assembly .. SUCCESS [
> >> >> 5.210 s]
> >> >> [INFO] Spark Project External MQTT  FAILURE
> >> >> [01:10
> >> >> min]
> >> >> [INFO] Spark Project External MQTT Assembly ... SKIPPED
> >> >> [INFO] Spark Project External ZeroMQ .. SKIPPED
> >> >> [INFO] Spark Project External Kafka ... SKIPPED
> >> >> [INFO] Spark Project Examples . SKIPPED
> >> >> [INFO] Spark Project External Kafka Assembly .. SKIPPED
> >> >> [INFO]
> >> >>
> >> >>
> 
> >> >> [INFO] BUILD FAILURE
> >> >> [INFO]
> >> >>
> >> >>
> 
> >> >> [INFO] Total time: 22:55 min
> >> >> [INFO] Finished at: 2015-09-07T22:42:57+00:00
> >> >> [INFO] Final Memory: 240M/455M
> >> >> [INFO]
> >> >>
> >> >>
> 
> >> >> [ERROR] GC overhead limit exceeded -> [Help 1]
> >> >> [ERROR]
> >> >> [ERROR] To see the full stack trace of the errors, re-run Maven with
> >> >> the
> >> >> -e switch.
> >> >> [ERROR] Re-run Maven using the -X switch to enable full debug
> logging.
> >> >> [ERROR]
> >> >> [ERROR] For more information about the errors and possible solutions,
> >> >> please read the following articles:
> >> >> [ERROR] [Help 1]
> >> >> http://cwiki.apache.org/confluence/display/MAVEN/OutOfMemoryError
> >> >> + return 1
> >> >> + exit 1
> >> 

Re: 1.5 Build Errors

2015-09-08 Thread Ted Yu
Do you run Zinc while compiling ?

Cheers

On Tue, Sep 8, 2015 at 7:56 AM, Benjamin Zaitlen  wrote:

> I'm still getting errors with 3g.  I've increased to 4g and I'll report back
>
> To be clear:
>
> export MAVEN_OPTS="-Xmx4g -XX:MaxPermSize=1024M
> -XX:ReservedCodeCacheSize=1024m"
>
> [ERROR] GC overhead limit exceeded -> [Help 1]
>> [ERROR]
>> [ERROR] To see the full stack trace of the errors, re-run Maven with the
>> -e switch.
>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>> [ERROR]
>> [ERROR] For more information about the errors and possible solutions,
>> please read the following articles:
>> [ERROR] [Help 1]
>> http://cwiki.apache.org/confluence/display/MAVEN/OutOfMemoryError
>> + return 1
>> + exit 1
>
>
> On Tue, Sep 8, 2015 at 10:03 AM, Sean Owen  wrote:
>
>> It might need more memory in certain situations / running certain
>> tests. If 3gb works for your relatively full build, yes you can open a
>> PR to change any occurrences of lower recommendations to 3gb.
>>
>> On Tue, Sep 8, 2015 at 3:02 PM, Benjamin Zaitlen 
>> wrote:
>> > Ah, right.  Should've caught that.
>> >
>> > The docs seem to recommend 2gb.  Should that be increased as well?
>> >
>> > --Ben
>> >
>> > On Tue, Sep 8, 2015 at 9:33 AM, Sean Owen  wrote:
>> >>
>> >> It shows you there that Maven is out of memory. Give it more heap. I
>> use
>> >> 3gb.
>> >>
>> >> On Tue, Sep 8, 2015 at 1:53 PM, Benjamin Zaitlen 
>> >> wrote:
>> >> > Hi All,
>> >> >
>> >> > I'm trying to build a distribution off of the latest in master and I
>> >> > keep
>> >> > getting errors on MQTT and the build fails.   I'm running the build
>> on a
>> >> > m1.large which has 7.5 GB of RAM and no other major processes are
>> >> > running.
>> >> >
>> >> >> MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M
>> -XX:ReservedCodeCacheSize=512m"
>> >> >> ./make-distribution.sh  --name continuum-custom-spark-1.5 --tgz
>> -Pyarn
>> >> >> -Phive -Phive-thriftserver -Phadoop-2.4 -Dhadoop.version=2.4.0
>> >> >
>> >> >
>> >> >
>> >> >> INFO] Spark Project GraphX ... SUCCESS [
>> >> >> 33.345 s]
>> >> >> [INFO] Spark Project Streaming  SUCCESS
>> >> >> [01:08
>> >> >> min]
>> >> >> [INFO] Spark Project Catalyst . SUCCESS
>> >> >> [01:39
>> >> >> min]
>> >> >> [INFO] Spark Project SQL .. SUCCESS
>> >> >> [02:06
>> >> >> min]
>> >> >> [INFO] Spark Project ML Library ... SUCCESS
>> >> >> [02:16
>> >> >> min]
>> >> >> [INFO] Spark Project Tools  SUCCESS
>> [
>> >> >> 4.087 s]
>> >> >> [INFO] Spark Project Hive . SUCCESS
>> >> >> [01:28
>> >> >> min]
>> >> >> [INFO] Spark Project REPL . SUCCESS
>> [
>> >> >> 16.291 s]
>> >> >> [INFO] Spark Project YARN Shuffle Service . SUCCESS
>> [
>> >> >> 13.671 s]
>> >> >> [INFO] Spark Project YARN . SUCCESS
>> [
>> >> >> 20.554 s]
>> >> >> [INFO] Spark Project Hive Thrift Server ... SUCCESS
>> [
>> >> >> 14.332 s]
>> >> >> [INFO] Spark Project Assembly . SUCCESS
>> >> >> [03:33
>> >> >> min]
>> >> >> [INFO] Spark Project External Twitter . SUCCESS
>> [
>> >> >> 14.208 s]
>> >> >> [INFO] Spark Project External Flume Sink .. SUCCESS
>> [
>> >> >> 11.535 s]
>> >> >> [INFO] Spark Project External Flume ... SUCCESS
>> [
>> >> >> 19.010 s]
>> >> >> [INFO] Spark Project External Flume Assembly .. SUCCESS
>> [
>> >> >> 5.210 s]
>> >> >> [INFO] Spark Project External MQTT  FAILURE
>> >> >> [01:10
>> >> >> min]
>> >> >> [INFO] Spark Project External MQTT Assembly ... SKIPPED
>> >> >> [INFO] Spark Project External ZeroMQ .. SKIPPED
>> >> >> [INFO] Spark Project External Kafka ... SKIPPED
>> >> >> [INFO] Spark Project Examples . SKIPPED
>> >> >> [INFO] Spark Project External Kafka Assembly .. SKIPPED
>> >> >> [INFO]
>> >> >>
>> >> >>
>> 
>> >> >> [INFO] BUILD FAILURE
>> >> >> [INFO]
>> >> >>
>> >> >>
>> 
>> >> >> [INFO] Total time: 22:55 min
>> >> >> [INFO] Finished at: 2015-09-07T22:42:57+00:00
>> >> >> [INFO] Final Memory: 240M/455M
>> >> >> [INFO]
>> >> >>
>> >> >>
>> 
>> >> >> [ERROR] GC overhead limit exceeded -> [Help 1]
>> >> >> [ERROR]
>> >> >> [ERROR] To see the full stack trace of the errors, re-run Maven with
>> >> >> the
>> >> >> -e switch.
>> >> >> [ERROR] Re-run Maven using the -X switch to enable full debug
>> 

Re: 1.5 Build Errors

2015-09-08 Thread Sean Owen
It might need more memory in certain situations / running certain
tests. If 3gb works for your relatively full build, yes you can open a
PR to change any occurrences of lower recommendations to 3gb.

On Tue, Sep 8, 2015 at 3:02 PM, Benjamin Zaitlen  wrote:
> Ah, right.  Should've caught that.
>
> The docs seem to recommend 2gb.  Should that be increased as well?
>
> --Ben
>
> On Tue, Sep 8, 2015 at 9:33 AM, Sean Owen  wrote:
>>
>> It shows you there that Maven is out of memory. Give it more heap. I use
>> 3gb.
>>
>> On Tue, Sep 8, 2015 at 1:53 PM, Benjamin Zaitlen 
>> wrote:
>> > Hi All,
>> >
>> > I'm trying to build a distribution off of the latest in master and I
>> > keep
>> > getting errors on MQTT and the build fails.   I'm running the build on a
>> > m1.large which has 7.5 GB of RAM and no other major processes are
>> > running.
>> >
>> >> MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
>> >> ./make-distribution.sh  --name continuum-custom-spark-1.5 --tgz -Pyarn
>> >> -Phive -Phive-thriftserver -Phadoop-2.4 -Dhadoop.version=2.4.0
>> >
>> >
>> >
>> >> INFO] Spark Project GraphX ... SUCCESS [
>> >> 33.345 s]
>> >> [INFO] Spark Project Streaming  SUCCESS
>> >> [01:08
>> >> min]
>> >> [INFO] Spark Project Catalyst . SUCCESS
>> >> [01:39
>> >> min]
>> >> [INFO] Spark Project SQL .. SUCCESS
>> >> [02:06
>> >> min]
>> >> [INFO] Spark Project ML Library ... SUCCESS
>> >> [02:16
>> >> min]
>> >> [INFO] Spark Project Tools  SUCCESS [
>> >> 4.087 s]
>> >> [INFO] Spark Project Hive . SUCCESS
>> >> [01:28
>> >> min]
>> >> [INFO] Spark Project REPL . SUCCESS [
>> >> 16.291 s]
>> >> [INFO] Spark Project YARN Shuffle Service . SUCCESS [
>> >> 13.671 s]
>> >> [INFO] Spark Project YARN . SUCCESS [
>> >> 20.554 s]
>> >> [INFO] Spark Project Hive Thrift Server ... SUCCESS [
>> >> 14.332 s]
>> >> [INFO] Spark Project Assembly . SUCCESS
>> >> [03:33
>> >> min]
>> >> [INFO] Spark Project External Twitter . SUCCESS [
>> >> 14.208 s]
>> >> [INFO] Spark Project External Flume Sink .. SUCCESS [
>> >> 11.535 s]
>> >> [INFO] Spark Project External Flume ... SUCCESS [
>> >> 19.010 s]
>> >> [INFO] Spark Project External Flume Assembly .. SUCCESS [
>> >> 5.210 s]
>> >> [INFO] Spark Project External MQTT  FAILURE
>> >> [01:10
>> >> min]
>> >> [INFO] Spark Project External MQTT Assembly ... SKIPPED
>> >> [INFO] Spark Project External ZeroMQ .. SKIPPED
>> >> [INFO] Spark Project External Kafka ... SKIPPED
>> >> [INFO] Spark Project Examples . SKIPPED
>> >> [INFO] Spark Project External Kafka Assembly .. SKIPPED
>> >> [INFO]
>> >>
>> >> 
>> >> [INFO] BUILD FAILURE
>> >> [INFO]
>> >>
>> >> 
>> >> [INFO] Total time: 22:55 min
>> >> [INFO] Finished at: 2015-09-07T22:42:57+00:00
>> >> [INFO] Final Memory: 240M/455M
>> >> [INFO]
>> >>
>> >> 
>> >> [ERROR] GC overhead limit exceeded -> [Help 1]
>> >> [ERROR]
>> >> [ERROR] To see the full stack trace of the errors, re-run Maven with
>> >> the
>> >> -e switch.
>> >> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>> >> [ERROR]
>> >> [ERROR] For more information about the errors and possible solutions,
>> >> please read the following articles:
>> >> [ERROR] [Help 1]
>> >> http://cwiki.apache.org/confluence/display/MAVEN/OutOfMemoryError
>> >> + return 1
>> >> + exit 1
>> >
>> >
>> > Any thoughts would be extremely helpful.
>> >
>> > --Ben
>
>

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: 1.5 Build Errors

2015-09-08 Thread Benjamin Zaitlen
Yes, just reran with the following

(spark_build)root@ip-10-45-130-206:~/spark# export MAVEN_OPTS="-Xmx4096mb
> -XX:MaxPermSize=1024M -XX:ReservedCodeCacheSize=1024m"
> (spark_build)root@ip-10-45-130-206:~/spark# build/mvn -Pyarn -Phadoop-2.4
> -Dhadoop.version=2.4.0 -DskipTests clean package


and grepping for java


root   641  9.9  0.3 4411732 49040 pts/4   Sl+  17:35   0:01
> /usr/lib/jvm/java-7-openjdk-amd64/bin/java -server -Xmx2g
> -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m
> -Dzinc.home=/root/spark/build/zinc-0.3.5.3 -classpath
> /root/spark/build/zinc-0.3.5.3/lib/compiler-interface-sources.jar:/root/spark/build/zinc-0.3.5.3/lib/incremental-compiler.jar:/root/spark/build/zinc-0.3.5.3/lib/nailgun-server.jar:/root/spark/build/zinc-0.3.5.3/lib/sbt-interface.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-compiler.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-library.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-reflect.jar:/root/spark/build/zinc-0.3.5.3/lib/zinc.jar
> com.typesafe.zinc.Nailgun 3030 0
> root   687  226  2.0 1803664 312876 pts/4  Sl+  17:36   0:22
> /usr/lib/jvm/java-7-openjdk-amd64/bin/java -Xms256m -Xmx512m -classpath
> /opt/anaconda/envs/spark_build/share/apache-maven-3.3.3/boot/plexus-classworlds-2.5.2.jar
> -Dclassworlds.conf=/opt/anaconda/envs/spark_build/share/apache-maven-3.3.3/bin/m2.conf
> -Dmaven.home=/opt/anaconda/envs/spark_build/share/apache-maven-3.3.3
> -Dmaven.multiModuleProjectDirectory=/root/spark
> org.codehaus.plexus.classworlds.launcher.Launcher -DzincPort=3030 -Pyarn
> -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package


On Tue, Sep 8, 2015 at 1:14 PM, Sean Owen  wrote:

> MAVEN_OPTS shouldn't affect zinc as it's an unrelated application. You
> can run "zinc -J-Xmx4g..." in general, but in the provided script,
> ZINC_OPTS seems to be the equivalent, yes. It kind of looks like your
> mvn process isn't getting any special memory args there. Is MAVEN_OPTS
> really exported?
>
> FWIW I use my own local mvn and zinc and it works fine.
>
> On Tue, Sep 8, 2015 at 6:05 PM, Benjamin Zaitlen 
> wrote:
> > I'm running zinc while compiling.  It seems that MAVEN_OPTS doesn't
> really
> > change much?  Or perhaps I'm misunderstanding something -- grepping for
> java
> > i see
> >
> >> root 24355  102  8.8 4687376 1350724 pts/4 Sl   16:51  11:08
> >> /usr/lib/jvm/java-7-openjdk-amd64/bin/java -server -Xmx2g
> >> -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m
> >> -Dzinc.home=/root/spark/build/zinc-0.3.5.3 -classpath
> >>
> /root/spark/build/zinc-0.3.5.3/lib/compiler-interface-sources.jar:/root/spark/build/zinc-0.3.5.3/lib/incremental-compiler.jar:/root/spark/build/zinc-0.3.5.3/lib/nailgun-server.jar:/root/spark/build/zinc-0.3.5.3/lib/sbt-interface.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-compiler.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-library.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-reflect.jar:/root/spark/build/zinc-0.3.5.3/lib/zinc.jar
> >> com.typesafe.zinc.Nailgun 3030 0
> >> root 25151 22.0  3.2 2269092 495276 pts/4  Sl+  16:53   1:56
> >> /usr/lib/jvm/java-7-openjdk-amd64/bin/java -Xms256m -Xmx512m -classpath
> >>
> /opt/anaconda/envs/spark_build/share/apache-maven-3.3.3/boot/plexus-classworlds-2.5.2.jar
> >>
> -Dclassworlds.conf=/opt/anaconda/envs/spark_build/share/apache-maven-3.3.3/bin/m2.conf
> >> -Dmaven.home=/opt/anaconda/envs/spark_build/share/apache-maven-3.3.3
> >> -Dmaven.multiModuleProjectDirectory=/root/spark
> >> org.codehaus.plexus.classworlds.launcher.Launcher -DzincPort=3030 clean
> >> package -DskipTests -Pyarn -Phive -Phive-thriftserver -Phadoop-2.4
> >> -Dhadoop.version=2.4.0
> >
> >
> > So the heap size is still 2g even with MAVEN_OPTS set with 4g.  I noticed
> > that within build/mvn _COMPILE_JVM_OPTS is set to 2g and this is what
> > ZINC_OPTS is set to.
> >
> > --Ben
> >
> >
> > On Tue, Sep 8, 2015 at 11:06 AM, Ted Yu  wrote:
> >>
> >> Do you run Zinc while compiling ?
> >>
> >> Cheers
> >>
> >> On Tue, Sep 8, 2015 at 7:56 AM, Benjamin Zaitlen 
> >> wrote:
> >>>
> >>> I'm still getting errors with 3g.  I've increased to 4g and I'll report
> >>> back
> >>>
> >>> To be clear:
> >>>
> >>> export MAVEN_OPTS="-Xmx4g -XX:MaxPermSize=1024M
> >>> -XX:ReservedCodeCacheSize=1024m"
> >>>
>  [ERROR] GC overhead limit exceeded -> [Help 1]
>  [ERROR]
>  [ERROR] To see the full stack trace of the errors, re-run Maven with
> the
>  -e switch.
>  [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>  [ERROR]
>  [ERROR] For more information about the errors and possible solutions,
>  please read the following articles:
>  [ERROR] [Help 1]
>  http://cwiki.apache.org/confluence/display/MAVEN/OutOfMemoryError
>  + return 1
>  + exit 1
> >>>
> >>>
> >>> On Tue, Sep 8, 2015 at 10:03 AM, Sean Owen  wrote:
> 
>  It might need more memory in certain 

Re: 1.5 Build Errors

2015-09-08 Thread Benjamin Zaitlen
I'm running zinc while compiling.  It seems that MAVEN_OPTS doesn't really
change much?  Or perhaps I'm misunderstanding something -- grepping for
java I see

root 24355  102  8.8 4687376 1350724 pts/4 Sl   16:51  11:08
> /usr/lib/jvm/java-7-openjdk-amd64/bin/java -server -Xmx2g
> -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m
> -Dzinc.home=/root/spark/build/zinc-0.3.5.3 -classpath
> /root/spark/build/zinc-0.3.5.3/lib/compiler-interface-sources.jar:/root/spark/build/zinc-0.3.5.3/lib/incremental-compiler.jar:/root/spark/build/zinc-0.3.5.3/lib/nailgun-server.jar:/root/spark/build/zinc-0.3.5.3/lib/sbt-interface.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-compiler.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-library.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-reflect.jar:/root/spark/build/zinc-0.3.5.3/lib/zinc.jar
> com.typesafe.zinc.Nailgun 3030 0
> root 25151 22.0  3.2 2269092 495276 pts/4  Sl+  16:53   1:56
> /usr/lib/jvm/java-7-openjdk-amd64/bin/java -Xms256m -Xmx512m -classpath
> /opt/anaconda/envs/spark_build/share/apache-maven-3.3.3/boot/plexus-classworlds-2.5.2.jar
> -Dclassworlds.conf=/opt/anaconda/envs/spark_build/share/apache-maven-3.3.3/bin/m2.conf
> -Dmaven.home=/opt/anaconda/envs/spark_build/share/apache-maven-3.3.3
> -Dmaven.multiModuleProjectDirectory=/root/spark
> org.codehaus.plexus.classworlds.launcher.Launcher -DzincPort=3030 clean
> package -DskipTests -Pyarn -Phive -Phive-thriftserver -Phadoop-2.4
> -Dhadoop.version=2.4.0


So the heap size is still 2g even with MAVEN_OPTS set to 4g.  I noticed
that within build/mvn, _COMPILE_JVM_OPTS is set to 2g, and this is what
ZINC_OPTS is set to.

--Ben
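The behavior described above -- zinc keeping a 2g heap no matter what
MAVEN_OPTS says -- is what you get from the `${VAR:-default}` fallback
pattern a wrapper script like build/mvn typically uses.  A minimal sketch
(variable names taken from this thread; the actual script may differ):

```shell
# Defaults the wrapper hard-codes for compilation; MAVEN_OPTS is
# never consulted for the zinc server's JVM.
_COMPILE_JVM_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

# ZINC_OPTS falls back to the compile defaults unless the user
# exported ZINC_OPTS themselves before invoking the build.
ZINC_OPTS=${ZINC_OPTS:-"$_COMPILE_JVM_OPTS"}

echo "zinc will start with: $ZINC_OPTS"
```

So exporting ZINC_OPTS (rather than MAVEN_OPTS) is what changes the zinc
server's heap.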


On Tue, Sep 8, 2015 at 11:06 AM, Ted Yu  wrote:

> Do you run Zinc while compiling ?
>
> Cheers
>
> On Tue, Sep 8, 2015 at 7:56 AM, Benjamin Zaitlen 
> wrote:
>
>> I'm still getting errors with 3g.  I've increased to 4g and I'll report
>> back
>>
>> To be clear:
>>
>> export MAVEN_OPTS="-Xmx4g -XX:MaxPermSize=1024M
>> -XX:ReservedCodeCacheSize=1024m"
>>
>> [ERROR] GC overhead limit exceeded -> [Help 1]
>>> [ERROR]
>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the
>>> -e switch.
>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>> [ERROR]
>>> [ERROR] For more information about the errors and possible solutions,
>>> please read the following articles:
>>> [ERROR] [Help 1]
>>> http://cwiki.apache.org/confluence/display/MAVEN/OutOfMemoryError
>>> + return 1
>>> + exit 1
>>
>>
>> On Tue, Sep 8, 2015 at 10:03 AM, Sean Owen  wrote:
>>
>>> It might need more memory in certain situations / running certain
>>> tests. If 3gb works for your relatively full build, yes you can open a
>>> PR to change any occurrences of lower recommendations to 3gb.
>>>
>>> On Tue, Sep 8, 2015 at 3:02 PM, Benjamin Zaitlen 
>>> wrote:
>>> > Ah, right.  Should've caught that.
>>> >
>>> > The docs seem to recommend 2gb.  Should that be increased as well?
>>> >
>>> > --Ben
>>> >
>>> > On Tue, Sep 8, 2015 at 9:33 AM, Sean Owen  wrote:
>>> >>
>>> >> It shows you there that Maven is out of memory. Give it more heap. I
>>> use
>>> >> 3gb.
>>> >>
>>> >> On Tue, Sep 8, 2015 at 1:53 PM, Benjamin Zaitlen 
>>> >> wrote:
>>> >> > Hi All,
>>> >> >
>>> >> > I'm trying to build a distribution off of the latest in master and I
>>> >> > keep
>>> >> > getting errors on MQTT and the build fails.   I'm running the build
>>> on a
>>> >> > m1.large which has 7.5 GB of RAM and no other major processes are
>>> >> > running.
>>> >> >
>>> >> >> MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M
>>> -XX:ReservedCodeCacheSize=512m"
>>> >> >> ./make-distribution.sh  --name continuum-custom-spark-1.5 --tgz
>>> -Pyarn
>>> >> >> -Phive -Phive-thriftserver -Phadoop-2.4 -Dhadoop.version=2.4.0
>>> >> >
>>> >> >
>>> >> >
>>> >> >> [INFO] Spark Project GraphX ... SUCCESS [ 33.345 s]
>>> >> >> [INFO] Spark Project Streaming  SUCCESS [01:08 min]
>>> >> >> [INFO] Spark Project Catalyst . SUCCESS [01:39 min]
>>> >> >> [INFO] Spark Project SQL .. SUCCESS [02:06 min]
>>> >> >> [INFO] Spark Project ML Library ... SUCCESS [02:16 min]
>>> >> >> [INFO] Spark Project Tools  SUCCESS [ 4.087 s]
>>> >> >> [INFO] Spark Project Hive . SUCCESS [01:28 min]
>>> >> >> [INFO] Spark Project REPL . SUCCESS [ 16.291 s]
>>> >> >> [INFO] Spark Project YARN Shuffle Service . SUCCESS [ 13.671 s]
>>> >> >> [INFO] Spark Project YARN . SUCCESS [ 20.554 s]
>>> >> >> [INFO] Spark

Re: 1.5 Build Errors

2015-09-08 Thread Sean Owen
MAVEN_OPTS shouldn't affect zinc as it's an unrelated application. You
can run "zinc -J-Xmx4g..." in general, but in the provided script,
ZINC_OPTS seems to be the equivalent, yes. It kind of looks like your
mvn process isn't getting any special memory args there. Is MAVEN_OPTS
really exported?

FWIW I use my own local mvn and zinc and it works fine.
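For a locally managed zinc like Sean describes, the heap can be raised with -J-prefixed JVM flags when (re)starting the server. A minimal sketch under those assumptions (the -Xmx4g value and MaxPermSize are illustrative; port 3030 matches the nailgun port in the ps output in this thread):

```shell
# Sketch: restart a standalone zinc 0.3.x server with a larger heap.
# Arguments prefixed with -J are passed through to zinc's own JVM.
ZINC_JVM_ARGS="-J-Xmx4g -J-XX:MaxPermSize=512m"
if command -v zinc >/dev/null 2>&1; then
  zinc -shutdown -port 3030 2>/dev/null || true  # stop any running server
  zinc -start -port 3030 $ZINC_JVM_ARGS          # restart with more heap
else
  # zinc not installed here; show the intended invocation instead
  echo "zinc not on PATH; would run: zinc -start -port 3030 $ZINC_JVM_ARGS"
fi
```

This only affects a zinc you start yourself; the zinc that build/mvn launches sizes its JVM from ZINC_OPTS instead.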

On Tue, Sep 8, 2015 at 6:05 PM, Benjamin Zaitlen  wrote:
> I'm running zinc while compiling.  It seems that MAVEN_OPTS doesn't really
> change much?  Or perhaps I'm misunderstanding something -- grepping for java
> I see
>
>> root 24355  102  8.8 4687376 1350724 pts/4 Sl   16:51  11:08
>> /usr/lib/jvm/java-7-openjdk-amd64/bin/java -server -Xmx2g
>> -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m
>> -Dzinc.home=/root/spark/build/zinc-0.3.5.3 -classpath
>> /root/spark/build/zinc-0.3.5.3/lib/compiler-interface-sources.jar:/root/spark/build/zinc-0.3.5.3/lib/incremental-compiler.jar:/root/spark/build/zinc-0.3.5.3/lib/nailgun-server.jar:/root/spark/build/zinc-0.3.5.3/lib/sbt-interface.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-compiler.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-library.jar:/root/spark/build/zinc-0.3.5.3/lib/scala-reflect.jar:/root/spark/build/zinc-0.3.5.3/lib/zinc.jar
>> com.typesafe.zinc.Nailgun 3030 0
>> root 25151 22.0  3.2 2269092 495276 pts/4  Sl+  16:53   1:56
>> /usr/lib/jvm/java-7-openjdk-amd64/bin/java -Xms256m -Xmx512m -classpath
>> /opt/anaconda/envs/spark_build/share/apache-maven-3.3.3/boot/plexus-classworlds-2.5.2.jar
>> -Dclassworlds.conf=/opt/anaconda/envs/spark_build/share/apache-maven-3.3.3/bin/m2.conf
>> -Dmaven.home=/opt/anaconda/envs/spark_build/share/apache-maven-3.3.3
>> -Dmaven.multiModuleProjectDirectory=/root/spark
>> org.codehaus.plexus.classworlds.launcher.Launcher -DzincPort=3030 clean
>> package -DskipTests -Pyarn -Phive -Phive-thriftserver -Phadoop-2.4
>> -Dhadoop.version=2.4.0
>
>
> So the heap size is still 2g even with MAVEN_OPTS set to 4g.  I noticed
> that within build/mvn, _COMPILE_JVM_OPTS is set to 2g and this is what
> ZINC_OPTS is set to.
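Ben's observation above suggests the fix: build/mvn sizes the zinc JVM from ZINC_OPTS (which defaults to _COMPILE_JVM_OPTS, i.e. 2g), so MAVEN_OPTS alone never reaches zinc. A hedged sketch, assuming build/mvn honors ZINC_OPTS as described here; the 4g/1024m values are illustrative (and note the unit must be a valid JVM size such as -Xmx4g or -Xmx4096m, not -Xmx4096mb):

```shell
# Sketch: give both the Maven JVM and the zinc server launched by
# build/mvn a larger heap before building Spark.
export MAVEN_OPTS="-Xmx4g -XX:MaxPermSize=1024m -XX:ReservedCodeCacheSize=1024m"
export ZINC_OPTS="$MAVEN_OPTS"   # build/mvn uses this for zinc's JVM

# Confirm both variables are really exported before invoking the build:
env | grep -E '^(MAVEN|ZINC)_OPTS'

# build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
```

Checking the running java processes afterwards (as done in this thread) is the quickest way to verify the -Xmx values actually took effect.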
>
> --Ben
>
>
> On Tue, Sep 8, 2015 at 11:06 AM, Ted Yu  wrote:
>>
>> Do you run Zinc while compiling ?
>>
>> Cheers
>>
>> On Tue, Sep 8, 2015 at 7:56 AM, Benjamin Zaitlen 
>> wrote:
>>>
>>> I'm still getting errors with 3g.  I've increased it to 4g and I'll report
>>> back
>>>
>>> To be clear:
>>>
>>> export MAVEN_OPTS="-Xmx4g -XX:MaxPermSize=1024M
>>> -XX:ReservedCodeCacheSize=1024m"
>>>
 [ERROR] GC overhead limit exceeded -> [Help 1]
 [ERROR]
 [ERROR] To see the full stack trace of the errors, re-run Maven with the
 -e switch.
 [ERROR] Re-run Maven using the -X switch to enable full debug logging.
 [ERROR]
 [ERROR] For more information about the errors and possible solutions,
 please read the following articles:
 [ERROR] [Help 1]
 http://cwiki.apache.org/confluence/display/MAVEN/OutOfMemoryError
 + return 1
 + exit 1
>>>
>>>
>>> On Tue, Sep 8, 2015 at 10:03 AM, Sean Owen  wrote:

 It might need more memory in certain situations / running certain
 tests. If 3gb works for your relatively full build, yes you can open a
 PR to change any occurrences of lower recommendations to 3gb.

 On Tue, Sep 8, 2015 at 3:02 PM, Benjamin Zaitlen 
 wrote:
 > Ah, right.  Should've caught that.
 >
 > The docs seem to recommend 2gb.  Should that be increased as well?
 >
 > --Ben
 >
 > On Tue, Sep 8, 2015 at 9:33 AM, Sean Owen  wrote:
 >>
 >> It shows you there that Maven is out of memory. Give it more heap. I use
 >> 3gb.
 >>
 >> On Tue, Sep 8, 2015 at 1:53 PM, Benjamin Zaitlen 
 >> wrote:
 >> > Hi All,
 >> >
 >> > I'm trying to build a distribution off of the latest in master, and I
 >> > keep getting errors on MQTT and the build fails. I'm running the build
 >> > on an m1.large, which has 7.5 GB of RAM, and no other major processes
 >> > are running.
 >> >
 >> >> MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
 >> >> ./make-distribution.sh --name continuum-custom-spark-1.5 --tgz -Pyarn
 >> >> -Phive -Phive-thriftserver -Phadoop-2.4 -Dhadoop.version=2.4.0
 >> >
 >> >
 >> >
 >> >> [INFO] Spark Project GraphX ... SUCCESS [ 33.345 s]
 >> >> [INFO] Spark Project Streaming  SUCCESS [01:08 min]
 >> >> [INFO] Spark Project Catalyst . SUCCESS [01:39 min]
 >> >> [INFO] Spark Project SQL .. SUCCESS [02:06 min]
 >> >> [INFO] Spark Project ML