Hello list,

     I am trying to collect Apache server logs into HDFS using
Flume-NG, but I am getting this error -

mohammad@ubuntu:~/flume-1.2.0-incubating-SNAPSHOT$ bin/flume-ng agent
-n agent1 -f conf/agent1.conf
Warning: No configuration directory set! Use --conf <dir> to override.
Info: Including Hadoop libraries found via
(/home/mohammad/hadoop-0.20.203.0/bin/hadoop) for HDFS access
Info: Excluding
/home/mohammad/hadoop-0.20.203.0/bin/../lib/slf4j-api-1.4.3.jar from
classpath
Info: Excluding
/home/mohammad/hadoop-0.20.203.0/bin/../lib/slf4j-log4j12-1.4.3.jar
from classpath
Info: Including HBASE libraries found via
(/home/mohammad/hbase-0.90.4/bin/hbase) for HBASE access
Info: Excluding /home/mohammad/hbase-0.90.4/lib/slf4j-api-1.5.8.jar
from classpath
Info: Excluding
/home/mohammad/hbase-0.90.4/lib/slf4j-log4j12-1.5.8.jar from classpath
+ exec /usr/lib/jvm/java-6-sun/bin/java -Xmx20m -cp
'/home/mohammad/flume-1.2.0-incubating-SNAPSHOT/lib/*:/home/mohammad/hadoop-0.20.203.0/conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/home/mohammad/hadoop-0.20.203.0/bin/..:/home/mohammad/hadoop-0.20.203.0/bin/../hadoop-core-0.20.203.0.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/aspectjrt-1.6.5.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/aspectjtools-1.6.5.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/commons-beanutils-1.7.0.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/commons-beanutils-core-1.8.0.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/commons-cli-1.2.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/commons-codec-1.4.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/commons-collections-3.2.1.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/commons-configuration-1.6.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/commons-daemon-1.0.1.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/commons-digester-1.8.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/commons-el-1.0.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/commons-httpclient-3.0.1.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/commons-lang-2.4.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/commons-logging-1.1.1.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/commons-logging-api-1.0.4.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/commons-math-2.1.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/commons-net-1.4.1.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/core-3.1.1.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/hsqldb-1.8.0.10.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/jackson-core-asl-1.0.1.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/jackson-mapper-asl-1.0.1.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/jasper-compiler-5.5.12.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/jasper-runtime-5.5.12.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/jets3t-0.6.1.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/jetty-6.1.26.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/jetty-util-6.1.26.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/jsch-0.1.42.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/junit-4.5.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/kfs-0.2.2.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/log4j-1.2.15.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/mockito-all-1.8.5.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/oro-2.0.8.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/servlet-api-2.5-20081211.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/xmlenc-0.52.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/jsp-2.1/jsp-2.1.jar:/home/mohammad/hadoop-0.20.203.0/bin/../lib/jsp-2.1/jsp-api-2.1.jar:/home/mohammad/hbase-0.90.4/conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/home/mohammad/hbase-0.90.4:/home/mohammad/hbase-0.90.4/hbase-0.90.4.jar:/home/mohammad/hbase-0.90.4/hbase-0.90.4-tests.jar:/home/mohammad/hbase-0.90.4/lib/activation-1.1.jar:/home/mohammad/hbase-0.90.4/lib/asm-3.1.jar:/home/mohammad/hbase-0.90.4/lib/avro-1.3.3.jar:/home/mohammad/hbase-0.90.4/lib/commons-cli-1.2.jar:/home/mohammad/hbase-0.90.4/lib/commons-codec-1.4.jar:/home/mohammad/hbase-0.90.4/lib/commons-configuration-1.6.jar:/home/mohammad/hbase-0.90.4/lib/commons-el-1.0.jar:/home/mohammad/hbase-0.90.4/lib/commons-httpclient-3.1.jar:/home/mohammad/hbase-0.90.4/lib/commons-lang-2.5.jar:/home/mohammad/hbase-0.90.4/lib/commons-logging-1.1.1.jar:/home/mohammad/hbase-0.90.4/lib/commons-net-1.4.1.jar:/home/mohammad/hbase-0.90.4/lib/core-3.1.1.jar:/home/mohammad/hbase-0.90.4/lib/guava-r06.jar:/home/mohammad/hbase-0.90.4/lib/hadoop-core-0.20.20
3.0.jar:/home/mohammad/hbase-0.90.4/lib/hadoop-core-0.20-append-r1056497.jar:/home/mohammad/hbase-0.90.4/lib/jackson-core-asl-1.5.5.jar:/home/mohammad/hbase-0.90.4/lib/jackson-jaxrs-1.5.5.jar:/home/mohammad/hbase-0.90.4/lib/jackson-mapper-asl-1.4.2.jar:/home/mohammad/hbase-0.90.4/lib/jackson-xc-1.5.5.jar:/home/mohammad/hbase-0.90.4/lib/jasper-compiler-5.5.23.jar:/home/mohammad/hbase-0.90.4/lib/jasper-runtime-5.5.23.jar:/home/mohammad/hbase-0.90.4/lib/jaxb-api-2.1.jar:/home/mohammad/hbase-0.90.4/lib/jaxb-impl-2.1.12.jar:/home/mohammad/hbase-0.90.4/lib/jersey-core-1.4.jar:/home/mohammad/hbase-0.90.4/lib/jersey-json-1.4.jar:/home/mohammad/hbase-0.90.4/lib/jersey-server-1.4.jar:/home/mohammad/hbase-0.90.4/lib/jettison-1.1.jar:/home/mohammad/hbase-0.90.4/lib/jetty-6.1.26.jar:/home/mohammad/hbase-0.90.4/lib/jetty-util-6.1.26.jar:/home/mohammad/hbase-0.90.4/lib/jruby-complete-1.6.0.jar:/home/mohammad/hbase-0.90.4/lib/jsp-2.1-6.1.14.jar:/home/mohammad/hbase-0.90.4/lib/jsp-api-2.1-6.1.14.jar:/home/mohammad/hbase-0.90.4/lib/jsr311-api-1.1.1.jar:/home/mohammad/hbase-0.90.4/lib/log4j-1.2.16.jar:/home/mohammad/hbase-0.90.4/lib/protobuf-java-2.3.0.jar:/home/mohammad/hbase-0.90.4/lib/servlet-api-2.5-6.1.14.jar:/home/mohammad/hbase-0.90.4/lib/stax-api-1.0.1.jar:/home/mohammad/hbase-0.90.4/lib/thrift-0.2.0.jar:/home/mohammad/hbase-0.90.4/lib/xmlenc-0.52.jar:/home/mohammad/hbase-0.90.4/lib/zookeeper-3.3.2.jar:/home/mohammad/hbase-0.90.4/conf'
-Djava.library.path=:/home/mohammad/hadoop-0.20.203.0/bin/../lib/native/Linux-amd64-64:/usr/lib/jvm/java-6-sun-1.6.0.30/jre/lib/amd64/server:/usr/lib/jvm/java-6-sun-1.6.0.30/jre/lib/amd64:/usr/lib/jvm/java-6-sun-1.6.0.30/jre/../lib/amd64:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
org.apache.flume.node.Application -n agent1 -f conf/agent1.conf
12/06/10 05:23:15 INFO lifecycle.LifecycleSupervisor: Starting
lifecycle supervisor 1
12/06/10 05:23:15 INFO node.FlumeNode: Flume node starting - agent1
12/06/10 05:23:15 INFO nodemanager.DefaultLogicalNodeManager: Node
manager starting
12/06/10 05:23:15 INFO properties.PropertiesFileConfigurationProvider:
Configuration provider starting
12/06/10 05:23:15 INFO lifecycle.LifecycleSupervisor: Starting
lifecycle supervisor 10
12/06/10 05:23:15 INFO properties.PropertiesFileConfigurationProvider:
Reloading configuration file:conf/agent1.conf
12/06/10 05:23:15 INFO conf.FlumeConfiguration: Processing:HDFS
12/06/10 05:23:15 INFO conf.FlumeConfiguration: Processing:HDFS
12/06/10 05:23:15 INFO conf.FlumeConfiguration: Processing:HDFS
12/06/10 05:23:15 INFO conf.FlumeConfiguration: Processing:HDFS
12/06/10 05:23:15 INFO conf.FlumeConfiguration: Added sinks: HDFS Agent: agent1
12/06/10 05:23:15 INFO conf.FlumeConfiguration: Post-validation flume
configuration contains configuration  for agents: [agent1]
12/06/10 05:23:15 INFO properties.PropertiesFileConfigurationProvider:
Creating channels
12/06/10 05:23:15 INFO properties.PropertiesFileConfigurationProvider:
created channel MemoryChannel-2
12/06/10 05:23:15 INFO sink.DefaultSinkFactory: Creating instance of
sink HDFS typehdfs
12/06/10 05:23:15 INFO hdfs.HDFSEventSink: Hadoop Security enabled: false
12/06/10 05:23:15 INFO nodemanager.DefaultLogicalNodeManager: Starting
new configuration:{ sourceRunners:{tail=EventDrivenSourceRunner: {
source:org.apache.flume.source.ExecSource@8f0c85e }}
sinkRunners:{HDFS=SinkRunner: {
policy:org.apache.flume.sink.DefaultSinkProcessor@77f297e7
counterGroup:{ name:null counters:{} } }}
channels:{MemoryChannel-2=org.apache.flume.channel.MemoryChannel@57d2fc36}
}
12/06/10 05:23:15 INFO nodemanager.DefaultLogicalNodeManager: Starting
Channel MemoryChannel-2
12/06/10 05:23:15 INFO nodemanager.DefaultLogicalNodeManager: Waiting
for channel: MemoryChannel-2 to start. Sleeping for 500 ms
12/06/10 05:23:15 INFO nodemanager.DefaultLogicalNodeManager: Starting Sink HDFS
12/06/10 05:23:15 INFO nodemanager.DefaultLogicalNodeManager: Starting
Source tail
12/06/10 05:23:15 INFO source.ExecSource: Exec source starting with
command:tail -f /var/log/apache2/access.log
12/06/10 05:23:16 INFO hdfs.BucketWriter: Creating
hdfs://localhost:9000/flume/FlumeData.1339285995904.tmp
12/06/10 05:23:16 ERROR hdfs.HDFSEventSink: process failed
java.lang.NoSuchMethodError: org.apache.hadoop.io.SequenceFile$Writer.syncFs()V
        at org.apache.flume.sink.hdfs.HDFSSequenceFile.sync(HDFSSequenceFile.java:77)
        at org.apache.flume.sink.hdfs.BucketWriter.doFlush(BucketWriter.java:276)
        at org.apache.flume.sink.hdfs.BucketWriter.access$500(BucketWriter.java:46)
        at org.apache.flume.sink.hdfs.BucketWriter$4.run(BucketWriter.java:265)
        at org.apache.flume.sink.hdfs.BucketWriter$4.run(BucketWriter.java:262)
        at org.apache.flume.sink.hdfs.BucketWriter.runPrivileged(BucketWriter.java:120)
        at org.apache.flume.sink.hdfs.BucketWriter.flush(BucketWriter.java:262)
        at org.apache.flume.sink.hdfs.BucketWriter.append(BucketWriter.java:322)
        at org.apache.flume.sink.hdfs.HDFSEventSink$1.call(HDFSEventSink.java:671)
        at org.apache.flume.sink.hdfs.HDFSEventSink$1.call(HDFSEventSink.java:668)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
        at java.util.concurrent.FutureTask.run(FutureTask.java:138)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
Exception in thread "SinkRunner-PollingRunner-DefaultSinkProcessor" java.lang.NoSuchMethodError: org.apache.hadoop.io.SequenceFile$Writer.syncFs()V
        at org.apache.flume.sink.hdfs.HDFSSequenceFile.sync(HDFSSequenceFile.java:77)
        at org.apache.flume.sink.hdfs.BucketWriter.doFlush(BucketWriter.java:276)
        at org.apache.flume.sink.hdfs.BucketWriter.access$500(BucketWriter.java:46)
        at org.apache.flume.sink.hdfs.BucketWriter$4.run(BucketWriter.java:265)
        at org.apache.flume.sink.hdfs.BucketWriter$4.run(BucketWriter.java:262)
        at org.apache.flume.sink.hdfs.BucketWriter.runPrivileged(BucketWriter.java:120)
        at org.apache.flume.sink.hdfs.BucketWriter.flush(BucketWriter.java:262)
        at org.apache.flume.sink.hdfs.BucketWriter.append(BucketWriter.java:322)
        at org.apache.flume.sink.hdfs.HDFSEventSink$1.call(HDFSEventSink.java:671)
        at org.apache.flume.sink.hdfs.HDFSEventSink$1.call(HDFSEventSink.java:668)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
        at java.util.concurrent.FutureTask.run(FutureTask.java:138)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
12/06/10 05:23:46 INFO hdfs.BucketWriter: Renaming
hdfs://localhost:9000/flume/FlumeData.1339285995904.tmp to
hdfs://localhost:9000/flume/FlumeData.1339285995904

As the log above shows, the .tmp file does get created and renamed on
HDFS, but the flush fails with the NoSuchMethodError. I have written
the following configuration for my agent -

agent1.sources = tail
agent1.channels = MemoryChannel-2
agent1.sinks = HDFS

agent1.sources.tail.type = exec
agent1.sources.tail.command = tail -f /var/log/apache2/access.log

agent1.sources.tail.channels = MemoryChannel-2
agent1.sinks.HDFS.channel = MemoryChannel-2

agent1.sinks.HDFS.type = hdfs
agent1.sinks.HDFS.hdfs.path = hdfs://localhost:9000/flume
agent1.sinks.HDFS.hdfs.fileType = SequenceFile

agent1.channels.MemoryChannel-2.type = memory
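
In case it helps with diagnosing the NoSuchMethodError, here is one way
to check whether a given hadoop-core jar on the classpath actually
provides SequenceFile.Writer.syncFs() (a quick sketch using the JDK's
javap; it prints the method signature if present and nothing otherwise):

javap -classpath /home/mohammad/hadoop-0.20.203.0/hadoop-core-0.20.203.0.jar \
    'org.apache.hadoop.io.SequenceFile$Writer' | grep syncFs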

I am using hadoop-0.20.203.0, hbase-0.90.4, and
flume-1.2.0-incubating-SNAPSHOT (as shown in the classpath above). Any
help would be greatly appreciated. Many thanks.

Regards,
    Mohammad Tariq
