Thanks all!
Trying out the history API, I ran into a spot of trouble: I'd like to log
history to a custom location
(by default it is written to the job's output directory in DFS).

However, setting hadoop.job.history.location (I tried adding it to both
core-site.xml and mapred-site.xml)
doesn't seem to help. The path it points to exists on disk.
What could I be missing?
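For reference, this is roughly the fragment I tried. The property name is the one above; the path is only an example, and I'm assuming (per the 1.x-era config layout) that it belongs in mapred-site.xml on the JobTracker side:

```xml
<!-- mapred-site.xml: sketch only; the value below is an example path -->
<property>
  <name>hadoop.job.history.location</name>
  <value>file:///var/log/hadoop/history</value>
</property>
```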

On 5 March 2012 16:27, Charles Earl <charles.ce...@gmail.com> wrote:

> In terms of accessing metrics(2) programmatically, do people generally
> extend FileSink for collecting data from small (1 - 15 node) installations,
> as opposed to using Chukwa, etc.?
> C
> On Mar 4, 2012, at 7:08 PM, George Datskos wrote:
>
> > Bharath,
> >
> > Try the hadoop job -history API
> >
> >
> >
> > On 2012/03/05 8:06, Bharath Ravi wrote:
> >> The Web UI does give me start and finish times, but I was wondering if
> >> there is a way to access these stats through an API, without having to
> >> grep through HTML.
> >>
> >> The "hadoop job -status" API was useful, but it doesn't seem to list
> >> wall-clock completion times.
> >> (It does give me CPU time, though.) Am I missing something?
> >
> >
>
>
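For anyone finding this thread later: the history command suggested above is run against the job's output directory. A sketch (the path is only an example; this assumes the 1.x CLI, and it needs a live cluster so it isn't runnable standalone):

```shell
# Sketch: print job details (including start/finish times) for a completed job.
# The output directory below is an example; "all" adds per-task details.
hadoop job -history /user/bharath/wordcount-output
hadoop job -history all /user/bharath/wordcount-output
```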


-- 
Bharath Ravi
