Dear Sharninder,

Thanks for your reply. Yes, I am playing with Flume, and as you suggested I am using the spool directory source. My configuration file looks like this:
tier1.sources = source1
tier1.channels = channel1
tier1.sinks = sink1
tier1.sources.source1.type = spooldir
tier1.sources.source1.spoolDir = /var/log/messages
tier1.sources.source1.channels = channel1
tier1.channels.channel1.type = memory
tier1.sinks.sink1.type = hdfs
tier1.sinks.sink1.hdfs.path = hdfs://localhost:8020/flume/messages
tier1.sinks.sink1.hdfs.fileType = SequenceFile
tier1.sinks.sink1.hdfs.filePrefix = data
tier1.sinks.sink1.hdfs.fileSuffix = .seq
# Roll based on the block size only
tier1.sinks.sink1.hdfs.rollCount = 0
tier1.sinks.sink1.hdfs.rollInterval = 0
tier1.sinks.sink1.hdfs.rollSize = 120000000
# Seconds to wait before closing the file
tier1.sinks.sink1.hdfs.idleTimeout = 60
tier1.sinks.sink1.channel = channel1
tier1.channels.channel1.capacity = 100000
tier1.sources.source1.deserializer.maxLineLength = 32768

The command I used is:

./flume-ng agent --conf ./conf/ -f bin/example.conf -Dflume.root.logger=DEBUG,console -n agent

After it creates the sources, channels, and sinks for the tier1 agent, it gives this warning:

No configuration found for this host:agent

Any help?

On Sun, Jun 15, 2014 at 11:18 AM, Sharninder <[email protected]> wrote:

>> I want to copy my local data to hdfs using flume in a single machine which
>> is running hadoop. How can I do that, please help me.
>
> What is this "local data"?
>
> If it's just files, why not use the hadoop fs copy command instead? If you
> want to play around with flume, take a look at the spool directory source
> or the exec source and you should be able to put something together that'll
> push data through flume to hadoop.
>
> --
> Sharninder

--
Thanks,
Kishore.
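
A minimal sketch of the same invocation, assuming the name passed to -n has to match the tier1.* property prefix used in the configuration above; that this mismatch explains the warning is an assumption, not something confirmed in this thread:

# Same command as above, with the agent name matching the tier1.* prefix in the config
./flume-ng agent --conf ./conf/ -f bin/example.conf -Dflume.root.logger=DEBUG,console -n tier1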

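A minimal sketch of the plain HDFS copy that Sharninder mentions, assuming the same local path and HDFS destination as the configuration above; both paths are placeholders and may need adjusting:

# Create the target directory in HDFS, then copy the local file into it
hadoop fs -mkdir -p /flume/messages
hadoop fs -put /var/log/messages /flume/messages/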