Yes, how can we use "hadoop job" to get MR job stats, especially
constituent task finish times?


On Thu, Apr 5, 2012 at 9:02 AM, Jay Vyas <jayunit...@gmail.com> wrote:

> (excuse the typo in the last email : I meant "I've been playing with Cinch"
> , not "I've been with Cinch")....
>
> On Thu, Apr 5, 2012 at 7:54 AM, Jay Vyas <jayunit...@gmail.com> wrote:
>
> > How can "hadoop job" be used to read m/r statistics?
> >
> > On Thu, Apr 5, 2012 at 7:30 AM, bikash sharma <sharmabiks...@gmail.com
> >wrote:
> >
> >> Thanks Kai, I will try those.
> >>
> >> On Thu, Apr 5, 2012 at 3:15 AM, Kai Voigt <k...@123.org> wrote:
> >>
> >> > Hi,
> >> >
> >> > On 05.04.2012 at 00:20, bikash sharma wrote:
> >> >
> >> > > Is it possible to get the execution time of the constituent
> map/reduce
> >> > > tasks of a MapReduce job (say sort) at the end of a job run?
> >> > > Preferably, can we obtain this programmatically?
> >> >
> >> >
> >> > You can access the JobTracker's web UI and see the start and stop
> >> > timestamps for every individual task.
> >> >
> >> > Since the JobTracker's Java API is exposed, you can also write your own
> >> > application to fetch that data programmatically.
> >> >
> >> > Also, "hadoop job" on the command line can be used to read job
> >> statistics.
> >> >
> >> > Kai
> >> >
> >> >
> >> > --
> >> > Kai Voigt
> >> > k...@123.org
> >> >
> >> >
> >> >
> >> >
> >> >
> >>
> >
> >
> >
> > --
> > Jay Vyas
> > MMSB/UCHC
> >
>
>
>
> --
> Jay Vyas
> MMSB/UCHC
>