Hi All

In my case, the problem was solved by adding
"-Dmapreduce.map.memory.mb=14848" on the command line. Just make sure
there is no other soft/hard memory limit on your account.
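
For reference, a sketch of the full invocation (the jar name, main class,
and input/output paths are placeholders; also, the map JVM heap set via
mapreduce.map.java.opts should stay comfortably below the container size
set by mapreduce.map.memory.mb):

```shell
# Placeholder jar/class/paths -- substitute your own job.
# Container limit 14848 MB; JVM heap 12 GB leaves headroom for
# non-heap memory so YARN does not kill the container.
hadoop jar my-job.jar com.example.MyJob \
  -Dmapreduce.map.memory.mb=14848 \
  -Dmapreduce.map.java.opts=-Xmx12g \
  /input /output
```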

Thanks for help.

Best,

Hai



On Sat, Nov 5, 2016 at 2:46 AM, Panagiotis Liakos <p.lia...@di.uoa.gr>
wrote:

> Hi all,
>
> This property in hadoop/conf/mapred-site.xml works for me:
>
> <property>
> <name>mapred.map.child.java.opts</name>
> <value>-Xmx10g</value>
> </property>
>
> Regards,
> Panagiotis
>
> 2016-11-04 23:11 GMT+02:00 Xenia Demetriou <xenia...@gmail.com>:
> > Hi,
> > I have the same problem. I added the following to
> > mapred-site.xml and hadoop-env.sh, but the problem persists.
> > I tried various values below but nothing increased the memory.
> >
> > mapred-site.xml:
> > <property>
> >     <name>mapred.child.java.opts</name>
> >     <value>-Xms256m </value>
> >     <value>-Xmx4096m</value>
> > </property>
> >
> > hadoop-env.sh:
> > export HADOOP_HEAPSIZE=3072
> > export HADOOP_OPTS="-Xmx4096m"
> >
> >
>
