I know that a simple way is to prepend the "<custom ID>" to every LOG.info()/LOG.warn() call, like this:
logger.info(ID + " start map logic");

But adding the ID by hand to every log call is not wise. Alternatively, does anyone know how to modify the MapReduce task's ConversionPattern configuration? I tried modifying the "RFA" appender to this:

-------------------
log4j.appender.RFA=org.apache.log4j.RollingFileAppender
log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file}
log4j.appender.RFA.MaxFileSize=${hadoop.log.maxfilesize}
log4j.appender.RFA.MaxBackupIndex=${hadoop.log.maxbackupindex}
log4j.appender.RFA.layout=org.apache.log4j.PatternLayout
# Pattern format: Date LogLevel LoggerName LogMessage
log4j.appender.RFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
# Debugging Pattern format
#log4j.appender.RFA.layout.ConversionPattern=%d{ISO8601} %-5p %l - [ID: %X{ID}] %m%n
-------------------

It does not work.

At 2016-11-04 11:26:57, "Maria" <linanmengxia...@126.com> wrote:
>
>Hi, dear developers,
>
>I'm trying to reconfigure $HADOOP/etc/hadoop/log4j.properties.
>I want to add a <custom ID> to the MapReduce log, before the log message, like this:
>"ID:234521 start map logic"
>
>My steps are as follows:
>
>(1) In my Mapper class:
>
>static Logger logger = LoggerFactory.getLogger(Mapper.class);
>....
>
>public void map(Object key, Text value, Context context)
>        throws IOException, InterruptedException {
>
>    MDC.put("ID", "operatorID");
>    logger.info("start map logic");
>
>    StringTokenizer itr = new StringTokenizer(value.toString());
>    while (itr.hasMoreTokens()) {
>        word.set(itr.nextToken());
>        context.write(word, one);
>    }
>}
>
>(2) Configure $HADOOP/etc/hadoop/log4j.properties:
>
>log4j.appender.TLA=org.apache.hadoop.mapred.TaskLogAppender
>log4j.appender.TLA.taskId=${hadoop.tasklog.taskid}
>log4j.appender.TLA.isCleanup=${hadoop.tasklog.iscleanup}
>log4j.appender.TLA.totalLogFileSize=${hadoop.tasklog.totalLogFileSize}
>
>log4j.appender.TLA.layout=org.apache.log4j.PatternLayout
>log4j.appender.TLA.layout.ConversionPattern=%l %p %c: ID:[%X{ID}] %m%n
>
>
>BUT it does not work,
>and because I use the slf4j API, I don't know how to get at the
>appenders.
>
>I am desperately in need of help.
>Any help would be highly appreciated.
>---------------------------------------------------------------------
>To unsubscribe, e-mail: user-unsubscr...@hadoop.apache.org
>For additional commands, e-mail: user-h...@hadoop.apache.org
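
In case it helps to narrow this down: with the slf4j-log4j12 binding, org.slf4j.MDC delegates to org.apache.log4j.MDC, so a %X{ID} conversion in a PatternLayout should pick up whatever was stored with MDC.put("ID", ...). A minimal standalone sketch to verify the pattern syntax outside Hadoop (the appender name "CON" is my own label, not from Hadoop's file):

-------------------
# Minimal standalone log4j.properties sketch (NOT Hadoop's file):
# route everything to a console appender whose pattern reads the
# MDC key "ID" via %X{ID}.
log4j.rootLogger=INFO, CON
log4j.appender.CON=org.apache.log4j.ConsoleAppender
log4j.appender.CON.layout=org.apache.log4j.PatternLayout
# %X{ID} prints the value stored with MDC.put("ID", ...), or nothing if unset
log4j.appender.CON.layout.ConversionPattern=%d{ISO8601} %p %c: ID:[%X{ID}] %m%n
-------------------

With this config on the classpath, MDC.put("ID", "234521"); logger.info("start map logic"); should emit a line containing ID:[234521]. If that works standalone but not in the job, my guess (not verified) would be that the pattern itself is fine and the problem is which log4j configuration the task JVMs actually load: the child task containers are configured separately from the client-side etc/hadoop/log4j.properties, so edits there may never reach the task logs.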