You seem to have it right - why not give it a try before asking? :)
On Mon, Jan 6, 2014 at 10:42 AM, unmesha sreeveni wrote:
> Given a CSV file with numeric data, I need to find the maximum element
> of each column. What should the map and reduce functions be?
>
> A guess: pass each column to map()
>
Given a CSV file with numeric data, I need to find the maximum element of
each column. What should the map and reduce functions be?
A guess: pass each column to map(), then find the max in reduce().
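A minimal local sketch of that guess, in the style of a Hadoop Streaming job (pure Python, no Hadoop required; the delimiter and the assumption that every field is numeric are mine):

```python
# Sketch: per-column maximum via map/reduce.
# Assumes comma-delimited rows where every field parses as a number.

from collections import defaultdict

def map_phase(lines):
    """Emit (column_index, value) pairs for every cell."""
    for line in lines:
        for i, field in enumerate(line.strip().split(",")):
            yield i, float(field)

def reduce_phase(pairs):
    """Keep the running maximum per column index."""
    maxima = defaultdict(lambda: float("-inf"))
    for col, val in pairs:
        maxima[col] = max(maxima[col], val)
    return dict(maxima)

rows = ["1,9,3", "4,5,6", "7,2,0"]
print(reduce_phase(map_phase(rows)))  # -> {0: 7.0, 1: 9.0, 2: 6.0}
```

In a real job the shuffle would group the (column, value) pairs by column before the reducer runs, so each reducer call sees one column's values.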
--
*Thanks & Regards*
Unmesha Sreeveni U.B
Junior Developer
http://www.unmeshasreeveni.blogspot.in/
There is another thread on 2.2 patch release:
http://search-hadoop.com/m/xHm23dcjsT/2.2+patch+release&subj=Re+Will+there+be+a+2+2+patch+releases+
Why not voice your opinion there?
Cheers
On Sun, Jan 5, 2014 at 1:24 PM, John Lilley wrote:
> Thanks, I missed the target 2.4.0 release. For 2.2.
Thanks, I missed the target 2.4.0 release. For 2.2.0, is there any way to
reach the individual task container logs?
John
From: Ted Yu [mailto:yuzhih...@gmail.com]
Sent: Saturday, January 04, 2014 10:47 AM
To: common-u...@hadoop.apache.org
Subject: Re: YARN log access
YARN-649 is targeted at 2.4
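Separately, if log aggregation is enabled on the cluster (yarn.log-aggregation-enable set to true), the yarn logs CLI can fetch a finished application's container logs from the command line; the application id below is a made-up placeholder:

```shell
# Sketch: fetching aggregated container logs with the YARN CLI.
# Requires yarn.log-aggregation-enable=true; the id is a placeholder.
if command -v yarn >/dev/null 2>&1; then
  yarn logs -applicationId application_1388880000000_0001
else
  echo "yarn CLI not found; command shown for illustration only"
fi
```

Individual containers can also be targeted with the -containerId and -nodeAddress options.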
Fantastic. Just what I needed to know.
John
From: Hardik Pandya [mailto:smarty.ju...@gmail.com]
Sent: Saturday, January 04, 2014 11:15 AM
To: user@hadoop.apache.org
Subject: Re: LocalResource size/time limits
Maybe this would clarify some aspects of your question:
Resource Localization in YARN
Every failed task typically carries a diagnostic message and a set of
logs for you to investigate what caused it to fail. Try visiting the
task's logs in the JT UI by clicking through the individual failed
attempts to find the reason for the failure.
On Sun, Jan 5, 2014 at 11:03 PM, Saeed Adel Mehraban wrote:
Vinod,
Thanks for your reply.
1. If I understand you correctly, you are asking me to change the memory
allocation for each map and reduce task. Isn't this related to physical
memory, which is not an issue (within limits) in my application? The
problem I am facing is with virtual memory.
2.
Hi German, thanks for your reply!
a) Yes, setting the property yarn.nodemanager.vmem-check-enabled to false
seems to have avoided the problem.
b) I would want to set the pmem/vmem ratio to a higher value and keep the
virtual memory within certain limits, but changing this value is not
having any effect.
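For reference, here are the two virtual-memory settings discussed above as they might appear in yarn-site.xml (a hedged sketch: the ratio value is illustrative; its default is 2.1):

```xml
<!-- yarn-site.xml fragment: virtual-memory enforcement settings -->
<property>
  <!-- Option a) disable the virtual-memory check entirely -->
  <name>yarn.nodemanager.vmem-check-enabled</name>
  <value>false</value>
</property>
<property>
  <!-- Option b) raise the vmem-to-pmem ratio instead (default 2.1) -->
  <name>yarn.nodemanager.vmem-pmem-ratio</name>
  <value>4</value>
</property>
```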
Hi all,
My jobs are failing because of many failed map tasks. I want to know what
makes a map task fail. Is it something like exceptions, or what?
Hi,
You have to connect Hive to a reporting tool. First, you need to change
the Hive metastore to MySQL, and then configure the reporting tool to
connect to it through the configured port.
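A sketch of the hive-site.xml properties typically changed to point the metastore at MySQL (the hostname, database name, and credentials below are placeholders, not values from this thread):

```xml
<!-- hive-site.xml fragment: MySQL-backed metastore (placeholder values) -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/metastore?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hivepass</value>
</property>
```

With the metastore in MySQL, a reporting tool can then read Hive tables through its JDBC/ODBC connector.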
On Sun, Jan 5, 2014 at 8:50 PM, chandu banavaram wrote:
> Hi experts,
> Please clarify the following doubt; I am a Hadoop learner.
Hi experts,
Please clarify the following doubt; I am a Hadoop learner.
How do I generate Hive reports?
With regards,
chandu.