> but its virtual memory has ballooned to a monstrous 332 GB. Does that ring
> a bell? Can you run regular Java applications on this node? This doesn't
> seem related to YARN per se.
>
> +Vinod
> Hortonworks Inc.
> http://hortonworks.com/
>
>
> On
> configure-yarn-in-hdp-2-0/
>
>
>
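The virtual-memory kill being discussed here is controlled by two NodeManager
settings. A minimal yarn-site.xml sketch, assuming the stock Hadoop 2.x
property names (the values are illustrative, not taken from this thread):

  <!-- yarn-site.xml: virtual-memory enforcement (illustrative values) -->
  <property>
    <name>yarn.nodemanager.vmem-check-enabled</name>
    <value>false</value>  <!-- turn the virtual-memory check off entirely -->
  </property>
  <property>
    <name>yarn.nodemanager.vmem-pmem-ratio</name>
    <value>2.1</value>  <!-- allowed vmem per unit of pmem; 2.1 is the default -->
  </property>

Disabling the check is the blunt fix; raising the ratio keeps enforcement
but tolerates JVMs that reserve a lot of address space.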
> 2013/12/5 panfei
>
>> We have already tried several values for these two parameters, but it
>> doesn't seem to help.
>>
>>
>> 2013/12/5 Tsuyoshi OZAWA
>>
>>> Hi,
>>>
>>> Please check the properties for the mappers/reducers.
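Tsuyoshi's suggestion is truncated above; it most plausibly points at the
per-task memory properties. A sketch using the standard MRv2 names (values
illustrative; the exact properties he meant are cut off):

  <!-- mapred-site.xml: per-task container memory (illustrative values) -->
  <property>
    <name>mapreduce.map.memory.mb</name>
    <value>1024</value>  <!-- physical-memory limit for each map container -->
  </property>
  <property>
    <name>mapreduce.reduce.memory.mb</name>
    <value>2048</value>  <!-- physical-memory limit for each reduce container -->
  </property>
  <property>
    <name>mapreduce.map.java.opts</name>
    <value>-Xmx768m</value>  <!-- JVM heap must sit below the container limit -->
  </property>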
>
> On Wed, Dec 4, 2013 at 10:16 PM, panfei wrote:
> >
> >
> > ---------- Forwarded message ----------
> > From: panfei
> > Date: 2013/12/4
> > Subject: Container
> > [pid=22885,containerID=container_138615044_0001_01_13] is running
> > beyond physical memory limits.
---------- Forwarded message ----------
From: panfei
Date: 2013/12/4
Subject: Container
[pid=22885,containerID=container_138615044_0001_01_13] is running
beyond physical memory limits. Current usage: 1.0 GB of 1 GB physical
memory used; 332.5 GB of 8 GB virtual memory used. Killing container.
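To read the numbers in that message: the 1 GB physical limit is the
container size (a mapreduce.*.memory.mb of roughly 1024), and the 8 GB
virtual ceiling is that limit times yarn.nodemanager.vmem-pmem-ratio, which
would put the ratio at 8 on this cluster (2.1 is the default). As arithmetic:

  1024 MB physical limit x 8 (vmem-pmem-ratio) = 8192 MB = 8 GB virtual limit
  332.5 GB of vmem used >> 8 GB allowed, so the NodeManager kills the container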
2012/12/4 Haitao Yao
> The firewall is OK.
> Well, personally I prefer Pig. And it's a big project; switching from Pig
> to Hive would not be easy.
> Thanks.
>
> Haitao Yao
> yao.e...@gmail.com
> weibo: @haitao_yao
> Skype: haitao.yao.final
>
> On 2012-12-4, at
Please check your firewall settings. And why not use Hive to do the work?
2012/12/4 Haitao Yao
> Hi all,
> I'm using Hadoop 1.2.0, java version "1.7.0_05".
> When running my Pig script, the workers always report this error, and the
> MR jobs run very slowly.
> Increasing the dfs.socket.timeout value does not help.
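For reference, the knob being tuned in this second thread, as a sketch
against hdfs-site.xml: dfs.socket.timeout is the DFS read-side socket
timeout, and dfs.datanode.socket.write.timeout is its write-side counterpart
(values illustrative, in milliseconds):

  <!-- hdfs-site.xml: DFS socket timeouts (illustrative values, milliseconds) -->
  <property>
    <name>dfs.socket.timeout</name>
    <value>180000</value>  <!-- read timeout; the default is 60000 (60 s) -->
  </property>
  <property>
    <name>dfs.datanode.socket.write.timeout</name>
    <value>180000</value>  <!-- write timeout; the default is 480000 (480 s) -->
  </property>

That said, as the reply above notes, a timeout that keeps firing no matter
how large the value usually points at the network (firewall) rather than at
the setting itself.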