Hi, that is because I am not sending proper syslog data, so I assumed the
WARNing is OK. I have a working example with the same source (syslog), the
same badly formed data, and a sink to HDFS.
To send the data I used nc:
$ nc -4 -u myhadoopNN
manual message
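A bare "manual message" carries no syslog header, which is what makes the source complain. For comparison, a minimally well-formed RFC 3164-style line sent the same way might look like this (the port and the host/tag in the message are assumptions, not from the original config):

```shell
# Hypothetical example: assumes the syslog source listens on UDP port 5140.
# <14> encodes facility 1 (user-level) and severity 6 (informational).
MSG='<14>Jun 20 08:57:00 myhost myapp: manual message'
printf '%s\n' "$MSG"
# sent the same way as before:
#   printf '%s\n' "$MSG" | nc -4 -u myhadoopNN 5140
```

The priority value in angle brackets plus the timestamp/hostname header is what the syslog source parses; without it the event is still accepted, but with a warning.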
Just to try it out, I changed the source to
Hi
This is a newbie question.
I have configured a Flume sink using the Rolling File Sink.
I have the following in my flume-conf.properties file:
agent1.sinks.purepath.type = com.x.diagnostics.flume.RollingFileSink
# once an hour
agent1.sinks.purepath.sink.rollInterval = 3600
# Force cutoff at 100
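For comparison, a minimal configuration for the stock file_roll sink that ships with Flume might look like the sketch below (the channel name and output directory are assumptions; the custom com.x.diagnostics class above may take different properties):

```properties
agent1.sinks = purepath
agent1.sinks.purepath.type = file_roll
agent1.sinks.purepath.channel = memoryChannel
agent1.sinks.purepath.sink.directory = /var/log/flume
# roll to a new file once an hour; 0 disables time-based rolling
agent1.sinks.purepath.sink.rollInterval = 3600
```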
It does not look like you are using the HTTP source at all. Your source type
needs to be http.
Cheers,
Hari
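A minimal HTTP source stanza could look like the sketch below (the agent, source, and channel names and the port are assumptions; the point is that the type must be http rather than exec or syslog):

```properties
agent1.sources = httpSrc
agent1.sources.httpSrc.type = http
agent1.sources.httpSrc.bind = 0.0.0.0
agent1.sources.httpSrc.port = 8081
agent1.sources.httpSrc.channels = memoryChannel
```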
On Thursday, June 20, 2013 at 8:57 AM, Nickolay Kolev wrote:
Hi all,
I am new to Flume and all this logging stuff, and probably many things are
still unclear to me despite having read the docs.
Hello all, I'm trying to load the app servers' request logs into Hadoop HDFS.
I get all the consolidated logs in one file per day. I'm running the Flume
agent with the following config:
##
agent.sources = apache
agent.sources.apache.type = exec
agent.sources.apache.command = cat
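One thing to note about exec with cat: it reads the file once and exits, and the exec source offers no delivery guarantees if the agent dies. A common alternative sketch, with a hypothetical log path standing in for the real one, would be:

```properties
# Sketch only; /path/to/access.log is a placeholder, not the real path.
# For a once-a-day consolidated file, the Spooling Directory Source is
# often a safer choice than exec.
agent.sources = apache
agent.sources.apache.type = exec
agent.sources.apache.command = tail -F /path/to/access.log
agent.sources.apache.channels = memoryChannel
```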
Hi,
I created a new table in HBase and inserted rows and columns
successfully using the normal HBaseSink. When I changed the HBaseSink to
the AsyncHBaseSink, no rows or columns are inserted. My requirement is: I
have hundreds of text files which are generated by some process. I want to
read those text files and
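When switching sinks, it is worth double-checking that the AsyncHBaseSink stanza names the same table and column family the HBaseSink used. A minimal sketch, with assumed agent/table/column-family names, might be:

```properties
# Sketch with placeholder names; the table and column family must
# already exist in HBase and must match the names here exactly.
agent.sinks = hbase
agent.sinks.hbase.type = asynchbase
agent.sinks.hbase.table = mytable
agent.sinks.hbase.columnFamily = cf
agent.sinks.hbase.channel = memoryChannel
```

A column-family mismatch or a missing serializer setting can silently result in no rows being written, which matches the symptom described above.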