I encountered the OOM problem because I hadn't set the ulimit open-files limit.
It had nothing to do with memory; memory was sufficient.
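
If you want to check this from inside the JVM, here is a minimal sketch,
assuming a Unix JVM where the com.sun.management extension is available
(the FdCheck class name is just for illustration):

    import java.lang.management.ManagementFactory;
    import com.sun.management.UnixOperatingSystemMXBean;

    public class FdCheck {
        public static void main(String[] args) {
            // On Unix JVMs the OS MXBean exposes the process's current
            // open file descriptor count and its ulimit -n ceiling.
            // (The cast below fails on non-Unix JVMs.)
            UnixOperatingSystemMXBean os = (UnixOperatingSystemMXBean)
                    ManagementFactory.getOperatingSystemMXBean();
            System.out.println("open fds: " + os.getOpenFileDescriptorCount());
            System.out.println("fd limit: " + os.getMaxFileDescriptorCount());
        }
    }

If the limit printed there is low (e.g. 1024), raising it for the user
running the task trackers is what fixed it for me.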

Best Regards,
Andy

2012/12/22 Manoj Babu <manoj...@gmail.com>

> David,
>
> I faced the same issue due to excessive logging filling the task
> tracker log folder.
>
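> One way to cap that is the per-task user-log limit; a rough sketch with
> the old mapred API (I'm assuming the Hadoop 1.x property name here, and
> the 10 MB figure is just an example):
>
>     import org.apache.hadoop.mapred.JobConf;
>
>     public class LogCap {
>         public static void capTaskLogs(JobConf conf) {
>             // Limit each task attempt's user log; the value is in KB,
>             // so 10240 caps logs at roughly 10 MB per attempt.
>             conf.set("mapred.userlog.limit.kb", "10240");
>         }
>     }
>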
> Cheers!
> Manoj.
>
>
> On Sat, Dec 22, 2012 at 9:10 PM, Stephen Fritz <steph...@cloudera.com> wrote:
>
>> Troubleshooting OOMs in map/reduce tasks can be tricky; see page 118 of
>> Hadoop Operations
>> <http://books.google.com/books?id=W5VWrrCOuQ8C&pg=PA123&lpg=PA123&dq=mapred+child+address+space+size&source=bl&ots=PCdqGFbU-Z&sig=ArgpJroU7UEmMqMB_hwXoCq7whk&hl=en&sa=X&ei=TNPVUMjjHsS60AGHtoHQDA&ved=0CEUQ6AEwAw#v=onepage&q=mapred%20child%20address%20space%20size&f=false>
>> for a couple of settings which can affect the frequency of OOMs and which
>> aren't necessarily intuitive.
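>>
>> For example, one knob in that family is the shuffle input buffer; a
>> sketch, assuming the Hadoop 1.x property name (0.5 is only an
>> illustrative value):
>>
>>     import org.apache.hadoop.mapred.JobConf;
>>
>>     public class ShuffleTuning {
>>         public static void shrinkShuffleBuffer(JobConf conf) {
>>             // Fraction of the reducer heap used to hold map outputs
>>             // during the shuffle; lowering it trades speed for headroom.
>>             conf.set("mapred.job.shuffle.input.buffer.percent", "0.5");
>>         }
>>     }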
>>
>> To answer your question about getting the heap dump, you should be able
>> to add "-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/some/path" to
>> your mapred.child.java.opts, then look for the heap dump in that path next
>> time you see the OOM.
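>>
>> If you'd rather set it from the job itself, a minimal sketch with the
>> old mapred API (the -Xmx value below is only a placeholder; keep
>> whatever heap size you already pass):
>>
>>     import org.apache.hadoop.mapred.JobConf;
>>
>>     public class HeapDumpOpts {
>>         public static void enableHeapDump(JobConf conf) {
>>             // Append the dump flags to the child JVM options; the dump
>>             // file will land under /some/path on the task tracker node.
>>             conf.set("mapred.child.java.opts",
>>                      "-Xmx512m"  // placeholder heap size
>>                      + " -XX:+HeapDumpOnOutOfMemoryError"
>>                      + " -XX:HeapDumpPath=/some/path");
>>         }
>>     }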
>>
>>
>> On Fri, Dec 21, 2012 at 11:33 PM, David Parks <davidpark...@yahoo.com> wrote:
>>
>>> I’m pretty consistently seeing a few reduce tasks fail with
>>> OutOfMemoryError (below). It doesn’t kill the job, but it slows it down.
>>>
>>> In my current case the reducer is pretty darn simple, the algorithm
>>> basically does (rough sketch below):
>>>
>>> 1. Do you have 2 values for this key?
>>>
>>> 2. If so, build a JSON string and emit a NullWritable and Text value.
>>>
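>>> Roughly, the shape is something like this in the old mapred API (a
>>> sketch only; the class name and JSON field names are made up, and the
>>> real JSON building is stubbed out):
>>>
>>>     import java.io.IOException;
>>>     import java.util.Iterator;
>>>
>>>     import org.apache.hadoop.io.NullWritable;
>>>     import org.apache.hadoop.io.Text;
>>>     import org.apache.hadoop.mapred.MapReduceBase;
>>>     import org.apache.hadoop.mapred.OutputCollector;
>>>     import org.apache.hadoop.mapred.Reducer;
>>>     import org.apache.hadoop.mapred.Reporter;
>>>
>>>     public class PairJsonReducer extends MapReduceBase
>>>             implements Reducer<Text, Text, NullWritable, Text> {
>>>         private final StringBuilder json = new StringBuilder(); // re-used
>>>         private final Text out = new Text();
>>>
>>>         public void reduce(Text key, Iterator<Text> values,
>>>                            OutputCollector<NullWritable, Text> output,
>>>                            Reporter reporter) throws IOException {
>>>             // Step 1: check whether this key has exactly 2 values.
>>>             // toString() copies the value out, since the old API re-uses
>>>             // the same Text instance on each next() call.
>>>             String first = values.hasNext() ? values.next().toString() : null;
>>>             String second = values.hasNext() ? values.next().toString() : null;
>>>             if (first == null || second == null || values.hasNext()) {
>>>                 return; // not exactly two values; emit nothing
>>>             }
>>>             // Step 2: build the JSON string into the shared buffer.
>>>             json.setLength(0);
>>>             json.append("{\"a\":\"").append(first)
>>>                 .append("\",\"b\":\"").append(second).append("\"}");
>>>             out.set(json.toString());
>>>             output.collect(NullWritable.get(), out);
>>>         }
>>>     }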
>>>
>>> The string buffer I use to build the json is re-used, and I can’t see
>>> anywhere in my code that would be taking more than ~50k of memory at any
>>> point in time.
>>>
>>> But I want to verify, is there a way to get the heap dump and all after
>>> this error? I'm running Hadoop v1.0.3 on AWS MapReduce.
>>>
>>> Error: java.lang.OutOfMemoryError: Java heap space
>>>         at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$MapOutputCopier.shuffleInMemory(ReduceTask.java:1711)
>>>         at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$MapOutputCopier.getMapOutput(ReduceTask.java:1571)
>>>         at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$MapOutputCopier.copyOutput(ReduceTask.java:1412)
>>>         at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$MapOutputCopier.run(ReduceTask.java:1344)
>>>
>>
>>
>
