OK, I am downloading Hadoop 1.0.3. Thanks
On Tue, Jul 31, 2012 at 3:26 PM, Brock Noland <[email protected]> wrote:
> Apache Hadoop 1.X or CDH3/4
>
> On Tue, Jul 31, 2012 at 9:20 AM, mardan Khan <[email protected]> wrote:
> > Could you please let me know which version of Hadoop would be suitable
> > for Flume 1.2.0? Then I will configure that version of Hadoop.
> >
> > Thanks
> >
> > On Tue, Jul 31, 2012 at 2:56 PM, Brock Noland <[email protected]> wrote:
> >>
> >> Ah, Hadoop 0.20.0 is ancient. I would try a more recent version.
> >>
> >> On Tue, Jul 31, 2012 at 8:31 AM, mardan Khan <[email protected]> wrote:
> >> > Hi Brock,
> >> >
> >> > I have run another configuration and it gives me the same error:
> >> > "dependencies were not found in classpath".
> >> >
> >> > I am using the following:
> >> > Flume 1.2.0
> >> > Windows 7 operating system
> >> > Cygwin
> >> > Hadoop 0.20.0
> >> >
> >> > The configuration file is:
> >> >
> >> > agent1.sources = source1
> >> > agent1.sinks = sink1
> >> > agent1.channels = channel1
> >> >
> >> > # Describe/configure source1
> >> > agent1.sources.source1.type = netcat
> >> > agent1.sources.source1.bind = localhost
> >> > agent1.sources.source1.port = 23
> >> >
> >> > # Describe sink1
> >> > #agent1.sinks.sink1.type = logger
> >> > agent1.sinks.sink1.type = hdfs
> >> > agent1.sinks.sink1.hdfs.path = hdfs://localhost:9000/user/user-pc/cyg_server/flume
> >> >
> >> > # Use a channel which buffers events in memory
> >> > agent1.channels.channel1.type = memory
> >> > agent1.channels.channel1.capacity = 1000
> >> > agent1.channels.channel1.transactionCapacity = 100
> >> >
> >> > # Bind the source and sink to the channel
> >> > agent1.sources.source1.channels = channel1
> >> > agent1.sinks.sink1.channel = channel1
> >> >
> >> > When I keep sinks.type = logger it works fine, but when I change it to
> >> > hdfs it gives me the dependencies error. The problem is with HDFS.
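[Editor's note: a minimal sketch of how an agent with the configuration above would typically be started and exercised with Flume NG 1.2.0. The file name agent1.conf is an assumption; the agent name must match the `agent1.` prefix used in the properties file.]

```shell
# Start the agent; --name must match the property prefix (agent1),
# and --conf-file points at the properties file shown above.
flume-ng agent --conf conf --conf-file conf/agent1.conf --name agent1 \
    -Dflume.root.logger=INFO,console

# In another terminal, send a test event to the netcat source,
# which is bound to localhost:23 in this configuration.
echo "hello flume" | nc localhost 23
```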
> >> > The error message is:
> >> >
> >> > 2012-07-31 14:18:55,138 (conf-file-poller-0) [ERROR - org.apache.flume.conf.file.AbstractFileConfigurationProvider$FileWatcherRunnable.run(AbstractFileConfigurationProvider.java:207)] Failed to start agent because dependencies were not found in classpath. Error follows.
> >> > java.lang.NoClassDefFoundError: org/apache/hadoop/io/SequenceFile$CompressionType
> >> >     at org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:205)
> >> >     at org.apache.flume.conf.Configurables.configure(Configurables.java:41)
> >> >     at org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.loadSinks(PropertiesFileConfigurationProvider.java:373)
> >> >     at org.apache.flume.conf.properties.PropertiesFileConfigurationProvider.load(PropertiesFileConfigurationProvider.java:223)
> >> >     at org.apache.flume.conf.file.AbstractFileConfigurationProvider.doLoad(AbstractFileConfigurationProvider.java:123)
> >> >     at org.apache.flume.conf.file.AbstractFileConfigurationProvider.access$300(AbstractFileConfigurationProvider.java:38)
> >> >     at org.apache.flume.conf.file.AbstractFileConfigurationProvider$FileWatcherRunnable.run(AbstractFileConfigurationProvider.java:202)
> >> >     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> >> >     at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:351)
> >> >     at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:178)
> >> >     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
> >> >     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
> >> >     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
> >> >     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
> >> >     at java.lang.Thread.run(Thread.java:722)
> >> > Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.SequenceFile$CompressionType
> >> >     at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> >> >     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >> >     at java.security.AccessController.doPrivileged(Native Method)
> >> >     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> >> >     at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
> >> >     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> >> >     at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
> >> >     ... 15 more
> >> >
> >> > Can you please help me solve this problem?
> >> >
> >> > Thanks
> >> >
> >> > On Tue, Jul 31, 2012 at 1:14 AM, Brock Noland <[email protected]> wrote:
> >> >>
> >> >> Hi,
> >> >>
> >> >> It looks like you are hitting:
> >> >> https://issues.apache.org/jira/browse/FLUME-1389
> >> >>
> >> >> The error should be below, but I don't see it, and this configuration
> >> >> works for me. Are you sure this is the exact configuration which is
> >> >> being used?
> >> >> > agent1.sources.tail.interceptors = hostint
> >> >> > agent1.sources.tail.interceptors.hostint.type =
> >> >> >     org.apache.flume.interceptor.HostInterceptor$Builder
> >> >> > agent1.sources.tail.interceptors.hostint.preserveExisting = true
> >> >> > agent1.sources.tail.interceptors.hostint.useIP = false
> >> >>
> >> >> Brock
> >>
> >> --
> >> Apache MRUnit - Unit testing MapReduce -
> >> http://incubator.apache.org/mrunit/
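[Editor's note: the NoClassDefFoundError above means the Hadoop client classes (here org.apache.hadoop.io.SequenceFile) are not on Flume's runtime classpath; the HDFS sink requires them. A common fix, sketched below under the assumption of a Hadoop 1.0.3 tarball install at /usr/local/hadoop-1.0.3, is to point FLUME_CLASSPATH at the Hadoop jars in conf/flume-env.sh. The flume-ng launcher can also pick up Hadoop jars automatically if the hadoop command is on the PATH.]

```shell
# conf/flume-env.sh (sketch; adjust HADOOP_HOME to your actual install path)

# Location of the Hadoop installation whose jars the HDFS sink should use.
export HADOOP_HOME=/usr/local/hadoop-1.0.3

# Add the Hadoop core jar and its library dependencies to Flume's classpath
# so classes like org.apache.hadoop.io.SequenceFile can be resolved.
export FLUME_CLASSPATH="$HADOOP_HOME/hadoop-core-1.0.3.jar:$HADOOP_HOME/lib/*"
```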
