Nachiket,

Not sure which Hadoop version/package you are using.
If it is stock Apache Hadoop, you might want to check ${hadoop.log.dir}/userlogs/xxx and ${hadoop.log.dir}/logs/history (${hadoop.log.dir} defaults to /tmp if nothing is specified), and possibly /grid/0/hadoop/var/log/.

Also check:
http://hadoop.apache.org/common/docs/r0.20.0/mapred_tutorial.html#Debugging
http://wiki.apache.org/hadoop/HowToDebugMapReducePrograms

If it is the Cloudera distribution, there is a detailed note at http://www.facebook.com/note.php?note_id=134161942002

Note that if job setup fails, the logs may not appear at all; this also depends on whether you have disabled logging in your configuration. The JobTracker web UI has links to the log directory, job history logs, etc.

Cheers,
/

On 4/6/10 7:46 PM, "steven zhuang" <steven.zhuang.1...@gmail.com> wrote:

hi, Nachiket,

I think if you write something to stderr, you should be able to find it in the .out log. Just make sure you are checking the right .out log file; you can do that by checking which tasktrackers are running your job from the web UI.

On Tue, Apr 6, 2010 at 6:56 PM, Nachiket Vaidya <nachik...@gmail.com> wrote:
> I have the following doubts:
>
> 1. How do I print log information in Hadoop? In the documentation I have
> read that hadoop-<username>-<processname>-<machinename>.log contains the
> logs. I have used
>
>     Log log = LogFactory.getLog(FBEMMapper.class);
>
> and
>
>     log.info(....);
>
> to write to the log, but I do not see any log information in the log
> file. I have also used System.out.println(), but that output does not
> appear in the .log or .out file either.
>
> Do we need to change some log level in Hadoop?
> Do we need to enable logging for some class?
> Which log4j.properties file do we need to change?
>
> Firstly, am I doing the right things for logging?
>
> The underlying problem is that I have written a custom FileInputFormat
> and WritableComparable for my purposes. My program runs fine, but I do
> not see any output, which is why I need to print some log statements to
> debug the problem.
>
> Thank you.
>
> - Nachiket
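P.S. On the "which log4j.properties" question: for stock Hadoop it is usually the one under conf/ in the Hadoop install directory, and it must be on the task JVM's classpath. A minimal sketch of the kind of settings involved is below; the package name com.example.fbem is a placeholder for wherever FBEMMapper actually lives, not something from this thread, and the exact default root logger value can differ between distributions.

```properties
# conf/log4j.properties -- sketch only, verify against your distribution's copy.
# Default threshold for everything; Hadoop 0.20-era configs typically
# express this via ${hadoop.root.logger} instead of a literal value.
log4j.rootLogger=INFO,console

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n

# Turn up logging for your own classes only.
# "com.example.fbem" is a hypothetical package name for FBEMMapper.
log4j.logger.com.example.fbem=DEBUG
```

Also keep in mind that log.info() calls made inside map/reduce tasks do not land in the daemon's hadoop-<username>-<processname>-<machinename>.log; they go to the per-attempt task logs under ${hadoop.log.dir}/userlogs/, alongside the task's stdout and stderr, which is why the JT/TT web UI links are usually the quickest way to find them.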