Re: OutOfMemoryError when running sbt/sbt test

2014-08-26 Thread Anand Avati
Hi Jay,
The recommended way to build Spark from source is with Maven.
You would want to follow the steps in
https://spark.apache.org/docs/latest/building-with-maven.html and set
MAVEN_OPTS to prevent OOM errors during the build.
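
If memory serves, that page suggests something along these lines (illustrative
values only; check the page for the current recommendation):

  # give the Maven JVM a larger heap and permgen before building
  export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
  mvn -DskipTests clean package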

Thanks


On Tue, Aug 26, 2014 at 5:49 PM, jay vyas wrote:

> Hi Spark,
>
> I've been trying to build Spark, but I've been getting lots of OOME
> exceptions.
>
> https://gist.github.com/jayunit100/d424b6b825ce8517d68c
>
> For the most part, they are of the form:
>
> java.lang.OutOfMemoryError: unable to create new native thread
>
> I've attempted to hard-code the "get_mem_opts" function in the
> sbt-launch-lib.bash file to use various very high parameter sizes
> (e.g. "-Xms5g") with a high MaxPermSize, etc., but to no avail.
>
> Any thoughts on this would be appreciated.
>
> I know of others having the same problem as well.
>
> Thanks!
>
> --
> jay vyas
>


Re: OutOfMemoryError when running sbt/sbt test

2014-08-26 Thread Jay Vyas
Thanks...! Some questions below.

1) You are suggesting that maybe this OOME is a symptom/red herring, and the 
true cause is that a thread can't spawn because of ulimit... If so, possibly 
this could be flagged early on in the build.  Also, where are so many 
threads coming from that I need to raise my limit?   Is this a new feature added 
to Spark recently, and if so, will it affect deployment scenarios as well?

And 

2) Possibly SBT_OPTS is where the memory settings should go? If so, why does 
the get_mem_opts wrapper function pass memory manually as -Xmx/-Xms options?

  execRunner "$java_cmd" \
    ${SBT_OPTS:-$default_sbt_opts} \
    $(get_mem_opts $sbt_mem) \
    ${java_opts} \
    ${java_args[@]} \
    -jar "$sbt_jar" \
    "${sbt_commands[@]}" \
    "${residual_args[@]}"



> On Aug 26, 2014, at 8:58 PM, Mubarak Seyed wrote:
> 
> What is your ulimit value?
> 
> 
>> On Tue, Aug 26, 2014 at 5:49 PM, jay vyas wrote:
>> Hi Spark,
>> 
>> I've been trying to build Spark, but I've been getting lots of OOME
>> exceptions.
>> 
>> https://gist.github.com/jayunit100/d424b6b825ce8517d68c
>> 
>> For the most part, they are of the form:
>> 
>> java.lang.OutOfMemoryError: unable to create new native thread
>> 
>> I've attempted to hard-code the "get_mem_opts" function in the
>> sbt-launch-lib.bash file to use various very high parameter sizes
>> (e.g. "-Xms5g") with a high MaxPermSize, etc., but to no avail.
>> 
>> Any thoughts on this would be appreciated.
>> 
>> I know of others having the same problem as well.
>> 
>> Thanks!
>> 
>> --
>> jay vyas
> 


Re: OutOfMemoryError when running sbt/sbt test

2014-08-26 Thread Mubarak Seyed
What is your ulimit value?
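
(Something like the following would show it; on Linux, "max user processes"
also caps the number of threads a user can create, and "unable to create new
native thread" usually points at that limit or native memory rather than the
Java heap -- illustrative commands only:

  ulimit -a        # show all per-user limits
  ulimit -u        # max user processes (threads count against this on Linux)
  ulimit -u 4096   # raise the soft limit for the current shell, if allowed
)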


On Tue, Aug 26, 2014 at 5:49 PM, jay vyas wrote:

> Hi Spark,
>
> I've been trying to build Spark, but I've been getting lots of OOME
> exceptions.
>
> https://gist.github.com/jayunit100/d424b6b825ce8517d68c
>
> For the most part, they are of the form:
>
> java.lang.OutOfMemoryError: unable to create new native thread
>
> I've attempted to hard-code the "get_mem_opts" function in the
> sbt-launch-lib.bash file to use various very high parameter sizes
> (e.g. "-Xms5g") with a high MaxPermSize, etc., but to no avail.
>
> Any thoughts on this would be appreciated.
>
> I know of others having the same problem as well.
>
> Thanks!
>
> --
> jay vyas
>