[ http://issues.apache.org/jira/browse/HADOOP-817?page=comments#action_12458930 ]
Sanjay Dahiya commented on HADOOP-817:
--------------------------------------

Tracked it down to MergeQueue.merge(): after a couple of iterations of that 
loop it runs out of memory. 
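For context, a merge loop like this stays within bounded memory only if it streams one element per segment at a time rather than materializing whole segments. The sketch below is purely illustrative of that streaming k-way merge pattern, assuming simple in-memory integer lists; it is not Hadoop's actual MergeQueue code, and the class name KWayMergeSketch is hypothetical.

```java
import java.util.*;

public class KWayMergeSketch {
    // Illustrative k-way merge: the heap holds one head element per
    // segment ({value, segmentIndex, position}), so peak heap usage
    // grows with the number of segments, not their total size.
    static List<Integer> merge(List<List<Integer>> segments) {
        PriorityQueue<int[]> heap =
            new PriorityQueue<>(Comparator.comparingInt(a -> a[0]));
        for (int i = 0; i < segments.size(); i++) {
            if (!segments.get(i).isEmpty()) {
                heap.add(new int[]{segments.get(i).get(0), i, 0});
            }
        }
        List<Integer> out = new ArrayList<>();
        while (!heap.isEmpty()) {
            int[] top = heap.poll();
            out.add(top[0]);
            int seg = top[1], pos = top[2] + 1;
            // Advance only within the segment we just consumed from.
            if (pos < segments.get(seg).size()) {
                heap.add(new int[]{segments.get(seg).get(pos), seg, pos});
            }
        }
        return out;
    }
}
```

If the real loop instead accumulates buffers across iterations (or re-reads segments into memory each pass), memory use grows with every iteration, which would match the OutOfMemoryError seen after a couple of iterations.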

> Streaming reducers throw OutOfMemory for not so large inputs
> ------------------------------------------------------------
>
>                 Key: HADOOP-817
>                 URL: http://issues.apache.org/jira/browse/HADOOP-817
>             Project: Hadoop
>          Issue Type: Bug
>          Components: contrib/streaming
>            Reporter: Sanjay Dahiya
>
> I am seeing OutOfMemoryError for moderate-size inputs (~70 text files, 20k 
> each) causing the job to fail in streaming. For very small inputs it still 
> succeeds. Looking into details. 

-- 
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators: 
http://issues.apache.org/jira/secure/Administrators.jspa
-
For more information on JIRA, see: http://www.atlassian.com/software/jira

        
