What I found out is that the default conf/log4j.properties sets the root logger to 
INFO, and indeed anything at INFO or above (Hadoop's or my own code's) shows up. 
However, I tried putting a new log4j.properties with a lower threshold in the 
new conf directory and specifying it with the "--config" option, and it did not 
work (it did pick up other things such as mapred-site.xml). Unfortunately, 
I am not the administrator and do not have the privilege to modify the 
default log4j.properties.  Do I have to ask the administrator to do it for 
me?  Thanks. 
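
For the record, the relevant part of the log4j.properties I put in my own conf 
directory looks roughly like this (abridged; the appender name is illustrative 
and com.mycompany stands for our actual package): 

# root logger as shipped in the default conf (INFO threshold)
log4j.rootLogger=INFO, console
# the override I added so our own classes log at DEBUG
log4j.logger.com.mycompany=DEBUG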

Zhu, Guojun
Modeling Sr Graduate
571-3824370
guojun_...@freddiemac.com
Financial Engineering
Freddie Mac



GUOJUN Zhu <guojun_...@freddiemac.com> wrote on 02/27/2012 11:34 AM
Reply-To: mapreduce-user@hadoop.apache.org
To: "mapreduce-user@hadoop.apache.org" <mapreduce-user@hadoop.apache.org>
Subject: no log function for map/red in a cluster setup

Hi.   

We are testing Hadoop (0.20.2-cdh3u3).  I am using a customized conf 
directory with "--config mypath".  I modified the log4j.properties file in 
this path, adding "log4j.logger.com.mycompany=DEBUG".  It works fine with our 
pseudo-one-node-cluster setup (1.00).  But in the new cluster (with 32 
data nodes/namenode/secondary namenode/jobtracker/backup jobtracker), I 
can only see the logs from Hadoop (in the web interface, when I navigate 
all the way into the task node log), but no logs from my mapper/reducer 
(com.mycompany.***) show up.  I can do System.out.println or 
System.err.println and see them in the same log file, but no logs from 
log4j show up.  Is there any other configuration I missed?  Thanks. 
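
For reference, the mapper obtains its logger roughly like this (the class name 
and messages are just illustrative); the System.out/System.err calls are the 
ones that do show up in the task logs: 

package com.mycompany;

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.log4j.Logger;

// Sketch of the mapper's logging setup (names are illustrative).
public class MyMapper extends Mapper<LongWritable, Text, Text, LongWritable> {

    private static final Logger LOG = Logger.getLogger(MyMapper.class);

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Visible only if log4j.logger.com.mycompany=DEBUG is in effect
        // in the task JVM's log4j configuration.
        LOG.debug("processing record: " + value);

        // This always appears in the task's stdout log, which is what
        // I am seeing on the cluster.
        System.out.println("stdout from mapper, key = " + key);
    }
}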

Zhu, Guojun
Modeling Sr Graduate
571-3824370
guojun_...@freddiemac.com
Financial Engineering
Freddie Mac
