Hi guys :

I have a map/reduce job that has always worked fine, but which now fails with a
heap space error on my local machine during unit tests.

It runs in Hadoop's default (local) mode, and fails during the constructor of
the MapOutputBuffer. Any thoughts on why?

I don't do any custom memory settings in my unit tests, because they aren't
really needed --- so I assume this is related to /tmp files
or something, but I can't track down the issue.
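
For what it's worth, I gather the MapOutputBuffer constructor allocates the whole in-memory sort buffer up front, sized by io.sort.mb (100 MB by default), so a test JVM launched with a small default heap could OOM on that single allocation even if the job itself is tiny. One thing I'm considering trying (just a sketch, not a confirmed fix) is shrinking that buffer in the configuration my tests use:

```xml
<!-- in the Configuration/JobConf used by the unit tests -->
<property>
  <name>io.sort.mb</name>
  <value>10</value> <!-- shrink the map-side sort buffer from the 100 MB default -->
</property>
```

(or equivalently conf.setInt("io.sort.mb", 10) on the JobConf before submitting the job).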

Any thoughts would be very much appreciated.

12/05/01 19:15:53 WARN mapred.LocalJobRunner: job_local_0002
java.lang.OutOfMemoryError: Java heap space
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:807)
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(MapTask.java:557)




-- 
Jay Vyas
MMSB/UCHC
