Hi Hari,

As I said, I am not a Java programmer and I do not think this is quality code. It is suited to our situation - we log JSON messages into ElasticSearch. It would be good if this code could be the base for a solution that becomes part of future releases.

What I have made is a custom serializer, plus changes in the sink that check whether the payload is valid and skip invalid events. More information can be found here:
https://github.com/lem0na/flume/commit/60d753ebc86eddda579e3b871b0a8ab62a0f2794
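In case it helps to see the idea without reading the whole commit: below is a minimal sketch of such a serializer against the Flume 1.5 ElasticSearchEventSerializer interface. It is only an illustration, not the code from the commit - the package and class name are made up, and returning null for an unparseable body only works together with a sink that has been patched to skip null documents.

package com.example.flume;  // hypothetical package, for illustration only

import java.io.IOException;

import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.conf.ComponentConfiguration;
import org.apache.flume.sink.elasticsearch.ElasticSearchEventSerializer;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;
import org.elasticsearch.common.xcontent.XContentParser;
import org.elasticsearch.common.xcontent.XContentType;

/**
 * Illustrative serializer: writes the event body through as the ES document
 * instead of wrapping it in an @message field. Returns null when the body is
 * not parseable JSON, so a sink patched to skip null documents can drop the
 * event.
 */
public class JsonBodyEventSerializer implements ElasticSearchEventSerializer {

  @Override
  public XContentBuilder getContentBuilder(Event event) throws IOException {
    try {
      // Parse the raw event body as JSON.
      XContentParser parser =
          XContentFactory.xContent(XContentType.JSON).createParser(event.getBody());
      parser.nextToken();
      // Copy the parsed structure 1:1 into the document builder.
      XContentBuilder builder = XContentFactory.jsonBuilder();
      builder.copyCurrentStructure(parser);
      return builder;
    } catch (Exception e) {
      // Body is not valid JSON - signal the (patched) sink to skip the event.
      return null;
    }
  }

  @Override
  public void configure(Context context) {
    // nothing to configure in this sketch
  }

  @Override
  public void configure(ComponentConfiguration conf) {
    // nothing to configure in this sketch
  }
}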
best regards,
Nickolay Kolev

On Wed, Nov 19, 2014 at 11:57 PM, Hari Shreedharan <[email protected]> wrote:

> If you submit your patch to Flume, we can hopefully commit it to the sink
> bundled with Flume.
>
> Thanks,
> Hari
>
>
> On Wed, Nov 19, 2014 at 12:42 AM, Nickolay Kolev <[email protected]> wrote:
>
>> Hi,
>>
>> It looks like the ES sink is oriented toward handling a text payload, not
>> JSON, and there are problems with serialization.
>> We have the same problem, and I developed a patch and a custom version
>> based on the sources (two months ago).
>> I am not a Java developer, so it is probably not the best solution, but
>> it works.
>> Right now it is hosted on the company's internal repo, but if you are
>> interested I can upload it to GitHub or Bitbucket.
>>
>> best regards,
>> Nickolay Kolev
>>
>>
>> On Mon, Nov 17, 2014 at 11:47 AM, shadyxu <[email protected]> wrote:
>>
>>> Hi everyone,
>>>
>>> I am now using Flume to collect logs into ElasticSearch, and the logs
>>> are in JSON format. However, when I check them in ES, it seems that
>>> Flume has put the entire JSON log into the @message attribute. Is there
>>> any config for this, or do I need to do some coding to separate them
>>> into different fields, as ES normally does?
>>>
>>> BTW, I found that ttl seems not to be working in ElasticSearch. I'm
>>> using Flume 1.5.0.1.
>>>
>>> Any clue would be appreciated.
>>>
>>
>
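For completeness, a custom serializer like the sketch above is wired in through the sink's serializer property. A rough example of the relevant part of the agent configuration (the agent, sink, channel, and serializer class names here are just placeholders):

agent.sinks.es.type = org.apache.flume.sink.elasticsearch.ElasticSearchSink
agent.sinks.es.channel = ch1
agent.sinks.es.hostNames = es-host-1:9300
agent.sinks.es.indexName = applogs
agent.sinks.es.indexType = logs
agent.sinks.es.serializer = com.example.flume.JsonBodyEventSerializer

With the default ElasticSearchLogStashEventSerializer the whole event body ends up under @message, which is what the original question describes.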
