How is the central log4j file made available to the tasks? After you make
your changes to the configuration file, does it help if you restart the task
trackers?
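
(For reference: on a stock 0.20-style install, the task JVMs pick up
conf/log4j.properties from each node's Hadoop config directory, so the
edited file needs to be on every worker before the restart; assuming the
standard scripts, that restart would be bin/hadoop-daemon.sh stop
tasktracker followed by start tasktracker on each node.)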

You could also try setting the log level programmatically in your "void
setup(Context)" method:

// Assumes the Log4j 1.x imports org.apache.log4j.Logger and
// org.apache.log4j.Level (java.util.logging has no DEBUG level).
@Override
protected void setup(Context context) {
  // Raise this class's logger to DEBUG for the lifetime of the task JVM.
  logger.setLevel(Level.DEBUG);
}
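
If you'd rather not hard-code the level, one variation is to read it from
the job configuration. This is just a sketch -- "my.job.log.level" is a
made-up property name, not a standard Hadoop one:

@Override
protected void setup(Context context) {
  // "my.job.log.level" is a made-up property for this sketch; pass it at
  // submit time with -D my.job.log.level=DEBUG (the -D only reaches the
  // job conf if your driver uses ToolRunner / GenericOptionsParser).
  String level = context.getConfiguration().get("my.job.log.level", "INFO");
  // org.apache.log4j.Level.toLevel() falls back to DEBUG if the string
  // doesn't parse.
  logger.setLevel(Level.toLevel(level));
}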

- Aaron

On Wed, Dec 15, 2010 at 2:23 PM, W.P. McNeill <bill...@gmail.com> wrote:

> I'm running on a cluster.  I'm trying to write to the log files on the
> cluster machines, the ones that are visible through the jobtracker web
> interface.
>
> The log4j file I gave excerpts from is a central one for the cluster.
>
> On Wed, Dec 15, 2010 at 1:38 PM, Aaron Kimball <akimbal...@gmail.com>
> wrote:
>
> > W. P.,
> >
> > How are you running your Reducer? Is everything running in standalone
> > mode (all mappers/reducers in the same process as the launching
> > application)? Or are you running this in pseudo-distributed mode or on
> > a remote cluster?
> >
> > Depending on the application's configuration, log4j configuration
> > could be read from one of many different places.
> >
> > Furthermore, where are you expecting your output? If you're running in
> > pseudo-distributed (or fully distributed) mode, mapper / reducer tasks
> > will not emit output back to the console of the launching application.
> > That only happens in local mode. In the distributed flavors, you'll see
> > a different file for each task attempt containing its log output, on
> > the machine where the task executed. These files can be accessed
> > through the web UI at http://jobtracker:50030/ -- click on the job,
> > then the task, then the task attempt, then "syslog" in the right-most
> > column.
> >
> > - Aaron
> >
> > On Mon, Dec 13, 2010 at 10:05 AM, W.P. McNeill <bill...@gmail.com>
> > wrote:
> >
> > > I would like to use Hadoop's Log4j infrastructure to do logging from
> > > my map/reduce application.  I think I've got everything set up
> > > correctly, but I am still unable to specify the logging level I want.
> > >
> > > By default Hadoop is set up to log at level INFO.  The first line of
> > > its log4j.properties file looks like this:
> > >
> > > hadoop.root.logger=INFO,console
> > >
> > >
> > > I have an application whose reducer looks like this:
> > >
> > > package com.me;
> > >
> > > public class MyReducer<...> extends Reducer<...> {
> > >   private static Logger logger =
> > >       Logger.getLogger(MyReducer.class.getName());
> > >
> > >   ...
> > >   protected void reduce(...) {
> > >       logger.debug("My message");
> > >       ...
> > >   }
> > > }
> > >
> > >
> > > I've added the following line to the Hadoop log4j.properties file:
> > >
> > > log4j.logger.com.me.MyReducer=DEBUG
> > >
> > >
> > > I expect the Hadoop system to log at level INFO, but my application
> > > to log at level DEBUG, so that I see "My message" in the logs for the
> > > reducer task.  However, my application does not produce any log4j
> > > output.  If I change the line in my reducer to read
> > > logger.info("My message") the message does get logged, so somehow I'm
> > > failing to specify that log level for this class.
> > >
> > > I've also tried changing the log4j line for my app to read
> > > log4j.logger.com.me.MyReducer=DEBUG,console and get the same result.
> > >
> > > I've been through the Hadoop and log4j documentation and I can't
> > > figure out what I'm doing wrong.  Any suggestions?
> > >
> > > Thanks.
> > >
> >
>