I have the misfortune of learning Ambari, Hadoop, Flume, and Sqoop all at the same time, and I am running into a problem with a simple agent configuration for Flume.
I have a very simple agent that I am running from the command line with:

    [root@namenode lib]# flume-ng agent --conf conf --conf-file /etc/flume/conf/a1/flume.conf -n a1

Here is the configuration file, which is managed by Ambari:

    a1.sources = r1
    a1.sinks = k1
    a1.channels = c1

    a1.sources.r1.type = seq

    a1.sinks.k1.type = file_roll
    a1.sinks.k1.channel = c1
    a1.sinks.k1.sink.directory = /tmp/flume

    a1.channels.c1.type = memory

    a1.sources.r1.channels = c1
    a1.sinks.k1.channel = c1

But I am getting this error:

    15/02/05 17:25:52 ERROR flume.SinkRunner: Unable to deliver event. Exception follows.
    org.apache.flume.EventDeliveryException: Failed to open file /tmp/flume/1423178752254-1 while delivering event
            at org.apache.flume.sink.RollingFileSink.process(RollingFileSink.java:177)
            at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68)
            at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147)
            at java.lang.Thread.run(Thread.java:745)
    Caused by: java.io.FileNotFoundException: /tmp/flume/1423178752254-1 (No such file or directory)

But here are the permissions for that directory:

    [root@namenode a1]# hadoop fs -ls /tmp/
    Found 7 items
    drwxrwxrwx   - root   hdfs   0 2015-02-04 15:53 /tmp/flume
    drwxr-xr-x   - admin  hdfs   0 2015-01-09 18:12 /tmp/hive-beeswax-admin
    drwxr-xr-x   - hue    hdfs   0 2014-12-19 11:49 /tmp/hive-beeswax-hue

I changed the ownership of the directory and checked again:

    [root@namenode a1]# sudo hadoop fs -ls /tmp/flume
    [root@namenode a1]# sudo hadoop fs -ls /tmp/
    Found 7 items
    drwxrwxrwx   - root   root   0 2015-02-04 15:53 /tmp/flume
    drwxr-xr-x   - admin  hdfs   0 2015-01-09 18:12 /tmp/hive-beeswax-admin

But I got the same results.
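One thing I am not sure about: as far as I can tell, the file_roll sink (RollingFileSink) writes to the local filesystem with plain Java file I/O rather than through HDFS, so the hadoop fs permissions above may not be the ones that matter. If that is right, I assume the directory has to exist locally on the node running the agent, something along these lines (the flume user in the chown is just a guess at who the agent runs as under Ambari; I launch it as root above):

    # Check whether /tmp/flume exists on the local filesystem of the node running the agent
    ls -ld /tmp/flume

    # Create it if it is missing and make sure the user running the agent can write to it
    mkdir -p /tmp/flume
    chown flume:flume /tmp/flume

Is that the directory the sink is actually trying to open?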
David Novogrodsky
[email protected]
http://www.linkedin.com/in/davidnovogrodsky