Ok, my fault. It is not about the PatternLayout. But deleting the file Flume is
currently using without restarting the agent is not a good idea. ;-)
Cheers
Seb.
On 19.05.2014, at 20:46, Sebastian Gäde wrote:
Hi,
thanks, this worked for me (using
http://stackoverflow.com/questions/6072389/how-to-create-a-own-appender-in-log4j):
public class MyFlumeAppender extends AppenderSkeleton {
    MyRpcClientFacade client = new MyRpcClientFacade();

    public MyFlumeAppender() {
        super();
        String flumeSer
Task JVMs in MR run with their own task-log4j.properties file, configured
via TaskLogAppender. This is currently not overridable or configurable.
Your best approach at the moment would be to configure your custom
appender in code at the beginning of your program.
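Registering an appender at program start can be sketched as follows. To keep the snippet dependency-free it uses the JDK's built-in java.util.logging instead of log4j, and CollectingHandler is a hypothetical stand-in for a Flume-backed appender:

```java
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

public class ProgrammaticLogging {

    // Hypothetical stand-in for a Flume-backed appender: just collects messages.
    static class CollectingHandler extends Handler {
        final StringBuilder buf = new StringBuilder();

        @Override public void publish(LogRecord record) {
            buf.append(record.getMessage()).append('\n');
        }
        @Override public void flush() {}
        @Override public void close() {}
    }

    public static void main(String[] args) {
        // Register the handler in code, before any job logic runs.
        Logger log = Logger.getLogger("example");
        log.setUseParentHandlers(false); // keep output away from the default console handler
        CollectingHandler handler = new CollectingHandler();
        handler.setLevel(Level.ALL);
        log.addHandler(handler);

        log.info("job started");
        System.out.print(handler.buf);   // prints: job started
    }
}
```

With log4j 1.x the equivalent move is to call Logger.getRootLogger().addAppender(...) with your custom appender instance at the top of main().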
On Sun, May 18, 2014 at 9:27 PM, Sebas
Please see the reply from oers in this thread:
http://stackoverflow.com/questions/9081625/override-log4j-properties-in-hadoop
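One common way to point Hadoop's client-side JVM at your own log4j configuration is via HADOOP_CLIENT_OPTS before launching the job; the properties file path below is a placeholder:

```shell
# Point the Hadoop client JVM at a custom log4j configuration.
# The file path is an example; adjust it to where your properties file lives.
export HADOOP_CLIENT_OPTS="-Dlog4j.configuration=file:///etc/myapp/log4j.properties"
```

Note this only affects the client JVM; as mentioned earlier in the thread, task JVMs keep their own task-log4j.properties.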
On May 18, 2014, at 10:38 AM, bo yang wrote:
It might be caused by multiple log4j.properties files on your class path
(e.g. in different jar files). For example, I found
hadoop-mapreduce-client-jobclient-2.4.0-tests.jar
on my class path, and there is a log4j.properties inside it. I had to
delete it manually to get logging written to files.
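A quick way to track down such stray copies is to list the contents of every jar on the class path; the lib directory here is a placeholder:

```shell
# Scan each jar in a (hypothetical) lib directory for a bundled log4j.properties.
for j in /path/to/lib/*.jar; do
  if unzip -l "$j" 2>/dev/null | grep -q 'log4j.properties'; then
    echo "$j bundles its own log4j.properties"
  fi
done
```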
Hi,
I'd like to log events from within my MR application using Flume, writing to a
central log file. I've followed
http://flume.apache.org/releases/content/1.4.0/FlumeUserGuide.html#log4j-appender
putting the configuration into Hadoop's log4j.properties and copying it to
Hadoop's conf directory on all nodes
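For reference, the Log4j appender section of the cited Flume 1.4.0 user guide wires things up along these lines; hostname, port, and the logger name are placeholders to adjust for your setup:

```properties
log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = example.com
log4j.appender.flume.Port = 41414

# Attach the appender to the classes whose events should ship to Flume
log4j.logger.org.example.MyClass = DEBUG,flume
```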