I can't say whether a HashMap whose serialized size is 1.6 GB will fit in a
4 GB heap (its in-memory footprint is typically larger than the serialized
form), but the properties you're using are invalid. Neither
mapred.map.child.opts nor mapred.reduce.child.opts exists; they must instead
be (if your version has them) mapred.map.child.java.opts and
mapred.reduce.child.java.opts.
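
For example, set them on the job Configuration before submission. A rough
driver sketch follows; the class name, job name and the input/output
arguments are placeholders, and older releases may only honor the single
mapred.child.java.opts key rather than the per-phase ones:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class HashMapJobDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Per-task JVM heap: note the ".java.opts" suffix that the
        // mapred.map.child.opts / mapred.reduce.child.opts names lack.
        conf.set("mapred.map.child.java.opts", "-Xmx4096M");
        conf.set("mapred.reduce.child.java.opts", "-Xmx4096M");

        Job job = new Job(conf, "hashmap-load");       // hypothetical job name
        job.setJarByClass(HashMapJobDriver.class);
        // job.setMapperClass(YourMapper.class);       // the Mapper with the setup() deserialization
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Also note that HADOOP_HEAPSIZE only sizes the client and daemon JVMs, not the
spawned task JVMs that run your Mapper, which is why raising it did not help.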

On Wed, Mar 20, 2013 at 12:10 PM, Abhishek Shivkumar
<abhisheksgum...@gmail.com> wrote:
> Hi,
>
>     I have a setup() method in the Mapper.java class where I am reading in a
> 1.6 GB HashMap that was serialized into a file and stored in HDFS. When I am
> running the job, it gets stuck at the readobject() method that reads this
> serialized file into a HashMap.
>
> I increased the heap size both by exporting HADOOP_HEAPSIZE=4096 and by
> calling conf.set("mapred.map.child.opts", "-Xmx4096M"); and
> conf.set("mapred.reduce.child.opts", "-Xmx4096M");
>
> It still doesn't help. Should we do something else? If I set
> HADOOP_HEAPSIZE any higher, the hadoop command won't run at all and fails
> to instantiate a JVM.
>
> Any comments would be appreciated!
>
> Thank you!
>
> With Regards,
> Abhishek S



-- 
Harsh J
