There is a single log4j file on a publicly accessible NFS path.
I haven't been able to try restarting the task trackers because the cluster
has to stay up all the time.
Setting the logging level programmatically works well, though. I can turn it
on and off with a Hadoop configuration parameter.
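A sketch of what that toggle could look like; the parameter name
my.job.debug.logging and the mapper class are my own inventions for
illustration, not something from this thread:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.log4j.Level;
import org.apache.log4j.Logger;

public class DebugToggleMapper extends Mapper<LongWritable, Text, Text, LongWritable> {

  private static final Logger LOG = Logger.getLogger(DebugToggleMapper.class);

  @Override
  protected void setup(Context context) {
    Configuration conf = context.getConfiguration();
    // Hypothetical switch: submit the job with -Dmy.job.debug.logging=true
    // to get DEBUG output in this task's logs; otherwise stay at INFO.
    boolean debug = conf.getBoolean("my.job.debug.logging", false);
    LOG.setLevel(debug ? Level.DEBUG : Level.INFO);
  }
}
```

Because the flag rides along in the job Configuration, the change takes effect
inside each task JVM without touching the shared log4j file.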
W. P.,
How are you running your Reducer? Is everything running in standalone mode
(all mappers/reducers in the same process as the launching application)? Or
are you running this in pseudo-distributed mode or on a remote cluster?
Depending on the application's configuration, the log4j configuration may be
picked up from a different place in each of those modes.
I'm running on a cluster. I'm trying to write to the log files on the
cluster machines, the ones that are visible through the jobtracker web
interface.
The log4j file I gave excerpts from is a central one for the cluster.
On Wed, Dec 15, 2010 at 1:38 PM, Aaron Kimball akimbal...@gmail.com wrote:
How is the central log4j file made available to the tasks? After you make
your changes to the configuration file, does it help if you restart the task
trackers?
You could also try setting the log level programmatically in your
setup(Context) method:

@Override
protected void setup(Context context) {
  Logger.getLogger(getClass()).setLevel(Level.DEBUG);
}
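Spelled out with its imports, that override might look like the following
sketch; the reducer class name is hypothetical, and Level.DEBUG is just an
example level:

```java
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.log4j.Level;
import org.apache.log4j.Logger;

public class MyReducer extends Reducer<Text, LongWritable, Text, LongWritable> {

  @Override
  protected void setup(Context context) {
    // Raise this class's logger to DEBUG for the lifetime of the task JVM.
    // A level set at runtime overrides whatever log4j.properties configured.
    Logger.getLogger(MyReducer.class).setLevel(Level.DEBUG);
  }
}
```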
I would like to use Hadoop's Log4j infrastructure to do logging from my
map/reduce application. I think I've got everything set up correctly, but I
am still unable to specify the logging level I want.
By default Hadoop is set up to log at level INFO. The first line of its
log4j.properties file
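For reference, the stock file of that era sets the default level through a
variable along these lines (quoted from memory, so your copy may differ
slightly):

```properties
# Define some default values that can be overridden by system properties
hadoop.root.logger=INFO,console

# The root logger inherits the level/appender chosen above
log4j.rootLogger=${hadoop.root.logger}
```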