Hi Kishore,

The issue, I think, is these settings:

tier1.sinks.sink1.hdfs.rollInterval=0
tier1.sinks.sink1.hdfs.rollSize = 120000000
# seconds to wait before closing the file.
tier1.sinks.sink1.hdfs.idleTimeout = 60

Can you try getting rid of the idleTimeout and changing rollInterval to 30, and 
see if that helps?
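
For reference, a minimal sketch of what the adjusted sink settings would look 
like (assuming the rest of the config stays as you posted it):

```
# Roll a file every 30 seconds, or at ~120 MB, whichever comes first
tier1.sinks.sink1.hdfs.rollCount = 0
tier1.sinks.sink1.hdfs.rollInterval = 30
tier1.sinks.sink1.hdfs.rollSize = 120000000
# idleTimeout removed
```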

Regards,

Guy Needham | Data Discovery
Virgin Media | Enterprise Data, Design & Management
Bartley Wood Business Park, Hook, Hampshire RG27 9UP
D 01256 75 3362

I welcome VSRE emails. Learn more at http://vsre.info/



________________________________
From: kishore alajangi [mailto:[email protected]]
Sent: 16 June 2014 08:04
To: [email protected]
Subject: Re: copy to hdfs

Could anybody help me?


On Mon, Jun 16, 2014 at 10:27 AM, kishore alajangi 
<[email protected]<mailto:[email protected]>> wrote:
Instead of just specifying hdfs.path = /flume/messages/, do I need to specify 
something else?


On Mon, Jun 16, 2014 at 10:25 AM, kishore alajangi 
<[email protected]<mailto:[email protected]>> wrote:
I created the /flume/messages directories, but Flume still writes nothing into 
them. Please help me.


On Mon, Jun 16, 2014 at 10:15 AM, kishore alajangi 
<[email protected]<mailto:[email protected]>> wrote:
Do I need to create the /flume/messages/ directories?



On Mon, Jun 16, 2014 at 10:14 AM, kishore alajangi 
<[email protected]<mailto:[email protected]>> wrote:
Checked; nothing is written in HDFS.


On Mon, Jun 16, 2014 at 10:10 AM, Sharninder 
<[email protected]<mailto:[email protected]>> wrote:
That just means the source has done its work and is waiting for more data to 
read. Did you check HDFS to see whether all the data has been written?
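
Something like this would show whether any files landed (assuming the path from 
your config):

```
hadoop fs -ls /flume/messages
```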



On Mon, Jun 16, 2014 at 11:34 AM, kishore alajangi 
<[email protected]<mailto:[email protected]>> wrote:
Hi Mohit and Sharninder,

Thanks for the reply. After I ran with -n tier, a "source is not a directory" 
error came up, so I changed the source to /tmp/ and hdfs.path to 
/flume/messages/ in the config file and ran the command again. The INFO I am 
getting now is "spooling directory source runner has shutdown".
What could be the problem? Please help me.


On Sun, Jun 15, 2014 at 10:21 PM, Mohit Durgapal 
<[email protected]<mailto:[email protected]>> wrote:
Replace -n agent with -n tier1
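
If it helps, the full invocation would then look something like this (unchanged 
from the original command except the agent name; the -n value must match the 
tier1 property prefix used in example.conf):

```
./flume-ng agent --conf ./conf/ -f bin/example.conf \
    -Dflume.root.logger=DEBUG,console -n tier1
```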


On Sunday, June 15, 2014, kishore alajangi 
<[email protected]<mailto:[email protected]>> wrote:
Dear Sharninder,

Thanks for your reply. Yes, I am playing with Flume. As you suggested, I am 
using the spooling directory source. My configuration file looks like this:

tier1.sources  = source1
tier1.channels = channel1
tier1.sinks    = sink1

tier1.sources.source1.type     = spooldir
tier1.sources.source1.spoolDir = /var/log/messages
tier1.sources.source1.channels = channel1
tier1.channels.channel1.type   = memory

tier1.sinks.sink1.type         = hdfs
tier1.sinks.sink1.hdfs.path = hdfs://localhost:8020/flume/messages
tier1.sinks.sink1.hdfs.fileType = SequenceFile
tier1.sinks.sink1.hdfs.filePrefix = data
tier1.sinks.sink1.hdfs.fileSuffix = .seq

# Roll based on the block size only
tier1.sinks.sink1.hdfs.rollCount=0
tier1.sinks.sink1.hdfs.rollInterval=0
tier1.sinks.sink1.hdfs.rollSize = 120000000
# seconds to wait before closing the file.
tier1.sinks.sink1.hdfs.idleTimeout = 60
tier1.sinks.sink1.channel      = channel1

tier1.channels.channel1.capacity = 100000
tier1.sources.source1.deserializer.maxLineLength = 32768

The command I used is:

./flume-ng agent --conf ./conf/ -f bin/example.conf -Dflume.root.logger=DEBUG,console -n agent

After it creates the sources, channels, and sinks for the tier1 agent, it gives 
this warning:

no configuration found for this host:agent

Any help?

On Sun, Jun 15, 2014 at 11:18 AM, Sharninder <[email protected]> wrote:

I want to copy my local data to hdfs using flume in a single machine which is 
running hadoop, How can I do that, please help me.

What is this "local data"?

If it's just files, why not use the hadoop fs copy command instead? If you want 
to play around with flume, take a look at the spool directory source or the 
exec source and you should be able to put something together that'll push data 
through flume to hadoop.

--
Sharninder




--
Thanks,
Kishore.



