The default task memory allocation size is set in the hadoop-default.xml
file for your configuration and is usually 200 MB per task.
The parameter is mapred.child.java.opts, and the value is generally
-Xmx200m.
You may alter this value in your JobConf object before you submit the job,
and the individual tasks will be launched with the new heap setting.
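For example, something along these lines (just a sketch; the class name,
the job name, and the 512m value are placeholders you would adapt):

    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;

    public class HeapExample {
        public static void main(String[] args) throws Exception {
            JobConf conf = new JobConf(HeapExample.class);
            conf.setJobName("heap-example");

            // Override the -Xmx200m default from hadoop-default.xml;
            // every map/reduce child JVM is launched with this heap size.
            conf.set("mapred.child.java.opts", "-Xmx512m");

            // ... set your mapper/reducer classes and input/output paths ...

            JobClient.runJob(conf);
        }
    }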
Hi Amandeep,
I've copied the following lines from a site:
--
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
This can have two reasons:
* Your Java application has a memory leak. There are tools like
YourKit Java Profiler that help you to identify such leaks.
* Your Java application genuinely needs that much memory. In that case
you can increase the maximum heap size with the -Xmx command-line option.
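In the second case, for a standalone program you would simply start the
JVM with a bigger heap, e.g. (MyApp and the 512 MB value are only
placeholders):

    java -Xmx512m MyApp

For a Hadoop task, the equivalent knob is the mapred.child.java.opts
setting discussed elsewhere in this thread.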
I'm getting the following error while running my Hadoop job:
09/02/06 15:33:03 INFO mapred.JobClient: Task Id :
attempt_200902061333_0004_r_00_1, Status : FAILED
java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Unknown Source)
at java.lang.AbstractStringBu