Thanks Sean! My novice understanding is that the 'native heap' is the
part of the process address space not allocated to the JVM heap, but I
wanted to check whether I was missing something.  It turns out my issue
was actual memory pressure on the executor machine: there was enough
physical memory for the JVM heap, but not much more.
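
In case it's useful to anyone else hitting this, here is roughly the
check I ended up doing (an untested sketch; it assumes a HotSpot JVM,
where the platform MXBean is com.sun.management.OperatingSystemMXBean):

    import java.lang.management.ManagementFactory;

    public class MemCheck {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            // Headroom inside the managed heap (bounded by -Xmx).
            long heapFree =
                rt.freeMemory() + (rt.maxMemory() - rt.totalMemory());
            System.out.println("JVM heap free: " + (heapFree >> 20) + " MB");

            // Physical memory left on the whole machine; this is what a
            // native malloc() actually competes for.
            com.sun.management.OperatingSystemMXBean os =
                (com.sun.management.OperatingSystemMXBean)
                    ManagementFactory.getOperatingSystemMXBean();
            System.out.println("OS physical free: "
                + (os.getFreePhysicalMemorySize() >> 20) + " MB");
        }
    }

If the first number is large but the second is near zero, the machine is
under real memory pressure even though the JVM heap looks fine.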

On Thu, Oct 30, 2014 at 12:49 PM, Sean Owen <so...@cloudera.com> wrote:
> No, but the JVM also does not allocate memory for native code on the
> heap. I don't think the heap has any bearing on whether your native
> code can allocate more memory, except that of course the heap is also
> taking memory.
>
> On Oct 30, 2014 6:43 PM, "Paul Wais" <pw...@yelp.com> wrote:
>>
>> Dear Spark List,
>>
>> I have a Spark app that runs native code inside map functions.  I've
>> noticed that the native code sometimes sets errno to ENOMEM,
>> indicating a lack of available memory.  However, I've verified that
>> the /JVM/ has plenty of heap space available --
>> Runtime.getRuntime().freeMemory() shows gigabytes free, and the
>> native code needs only megabytes.  Does Spark limit the /native/
>> heap size somehow?  I'm poking through the executor code now but
>> don't see anything obvious.
>>
>> Best Regards,
>> -Paul Wais
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>> For additional commands, e-mail: user-h...@spark.apache.org
>>
>
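
To make Sean's point concrete for anyone searching the archives later:
-Xmx bounds only the managed heap, while a native allocation draws from
the process's address space and the OS, so it can fail with ENOMEM even
while the heap has gigabytes free.  A small untested sketch using direct
buffers (which are backed by native memory, though capped separately by
-XX:MaxDirectMemorySize) shows the heap counters barely move:

    import java.nio.ByteBuffer;

    public class NativeVsHeap {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long before = rt.freeMemory();
            // 256 MB allocated from native memory, not the managed heap.
            ByteBuffer buf = ByteBuffer.allocateDirect(256 << 20);
            long after = rt.freeMemory();
            System.out.println("heap free before: " + (before >> 20) + " MB");
            System.out.println("heap free after:  " + (after >> 20) + " MB");
            // The two numbers are nearly identical: the 256 MB came from
            // the process's native address space, which is where a native
            // ENOMEM originates.
            System.out.println("direct buffer? " + buf.isDirect());
        }
    }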
