Re: how to get topic names in SinkFunction when using FlinkKafkaConsumer010 with multiple topics

2017-07-16 Thread Tzu-Li (Gordon) Tai
Hi, Here’s an example: DataStream<String> inputStream = …; inputStream.addSink(new FlinkKafkaProducer09<>("defaultTopic", new CustomKeyedSerializationSchema(), props)); Code for CustomKeyedSerializationSchema: public class CustomKeyedSerializationSchema implements KeyedSerializationSchema<String> {
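The snippet is cut off above; a minimal sketch of how such a schema could continue, assuming String elements (the routing logic and topic names are made up for illustration):

    import org.apache.flink.streaming.util.serialization.KeyedSerializationSchema;

    public class CustomKeyedSerializationSchema implements KeyedSerializationSchema<String> {

        @Override
        public byte[] serializeKey(String element) {
            // No message key; Kafka treats the record as unkeyed.
            return null;
        }

        @Override
        public byte[] serializeValue(String element) {
            return element.getBytes();
        }

        @Override
        public String getTargetTopic(String element) {
            // A non-null return value overrides the producer's "defaultTopic",
            // so each element can be routed to its own topic.
            return element.startsWith("a") ? "topicA" : "topicB";
        }
    }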

Re: Kafka Producer - Null Pointer Exception when processing by element

2017-07-16 Thread Tzu-Li (Gordon) Tai
Hi, void output(DataStream<List<Event>> inputStream) { This seems odd. Are your events intended to be a list? If not, this should be a `DataStream<Event>`. From the code snippet you’ve attached in the first post, it seems like you’ve initialized your source incorrectly. `env.fromElements(List<...>)` will
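To illustrate the distinction the reply is drawing, a minimal sketch (the String element type and the sample values are assumptions):

    import java.util.Arrays;
    import java.util.List;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class SourceInitExample {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            List<String> events = Arrays.asList("a", "b", "c");

            // fromElements(list) yields a DataStream<List<String>> with a single
            // element: the list itself.
            DataStream<List<String>> wholeList = env.fromElements(events);

            // fromCollection(list) yields a DataStream<String> with one element
            // per list entry, which is usually what is intended.
            DataStream<String> perElement = env.fromCollection(events);

            perElement.print();
            env.execute("source initialization example");
        }
    }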

Re: High back-pressure after recovering from a save point

2017-07-16 Thread Kien Truong
Hi, We have been testing with the FsStateBackend for the last few days and have not encountered this issue anymore. However, we will evaluate the RocksDB backend again soon, because we want incremental checkpoints. I will report back if I have more updates. Best regards, Kien On Jul 15,
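For reference, a minimal sketch of enabling incremental checkpoints with the RocksDB backend (available from Flink 1.3; the checkpoint URI and interval are placeholders):

    import org.apache.flink.contrib.streaming.state.RocksDBStateBackend;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class IncrementalCheckpointConfig {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            // The second constructor argument enables incremental checkpointing.
            env.setStateBackend(new RocksDBStateBackend("hdfs:///flink/checkpoints", true));
            env.enableCheckpointing(60000); // checkpoint every 60 seconds
            // ... define sources, transformations, sinks, then env.execute() ...
        }
    }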

Re: S3 recovery and checkpoint directories exhibit explosive growth

2017-07-16 Thread SHI Xiaogang
Hi Prashantnayak, Thanks a lot for reporting this problem. Can you provide more details to address it? I am guessing the master has to delete too many files when a checkpoint is subsumed, which is very common in our case. The number of files in the recovery directory will increase if the master

Re: StreamTableSource

2017-07-16 Thread Fabian Hueske
Hi, the Table API internally operates on Row. If you ingest other types, they are first converted into Rows. I would recommend converting your DataStream<YourType> into a DataStream<Row> using a MapFunction, and then converting that stream into a Table using TableEnvironment.fromDataStream(). Best, Fabian 2017-07-12
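A minimal sketch of that conversion (the input format, field names, and schema are made up for illustration):

    import org.apache.flink.api.common.functions.MapFunction;
    import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
    import org.apache.flink.api.java.typeutils.RowTypeInfo;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.table.api.Table;
    import org.apache.flink.table.api.TableEnvironment;
    import org.apache.flink.table.api.java.StreamTableEnvironment;
    import org.apache.flink.types.Row;

    public class ToRowExample {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            StreamTableEnvironment tableEnv = TableEnvironment.getTableEnvironment(env);

            DataStream<String> input = env.fromElements("a,1", "b,2");

            DataStream<Row> rows = input
                .map(new MapFunction<String, Row>() {
                    @Override
                    public Row map(String value) {
                        String[] parts = value.split(",");
                        return Row.of(parts[0], Integer.parseInt(parts[1]));
                    }
                })
                // Row is a generic type, so its field types must be declared explicitly.
                .returns(new RowTypeInfo(BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.INT_TYPE_INFO));

            Table table = tableEnv.fromDataStream(rows, "name, cnt");
            table.printSchema();
        }
    }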

Re: Kafka Producer - Null Pointer Exception when processing by element

2017-07-16 Thread earellano
Tzu-Li (Gordon) Tai wrote > It seems like you’ve misunderstood how to use the FlinkKafkaProducer, or > is there any specific reason why you want to emit elements to Kafka in a > map function? > > The correct way to use it is to add it as a sink function to your > pipeline, i.e. > > DataStream >
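For contrast with the per-element map approach, a minimal sketch of attaching the producer as a sink (broker address, topic name, and sample data are placeholders):

    import java.util.Properties;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer010;
    import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

    public class KafkaSinkExample {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            Properties props = new Properties();
            props.setProperty("bootstrap.servers", "localhost:9092");

            DataStream<String> events = env.fromElements("one", "two", "three");

            // The producer is attached once as a sink and Flink invokes it for
            // every element; constructing it inside a map function leaves it
            // unopened, which typically surfaces as a NullPointerException.
            events.addSink(new FlinkKafkaProducer010<>(
                "my-topic", new SimpleStringSchema(), props));

            env.execute("Kafka sink example");
        }
    }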

Re: Flink Elasticsearch Connector: Lucene Error message

2017-07-16 Thread Aljoscha Krettek
Hi, There was also a problem in releasing the ES 5 connector with Flink 1.3.0. You only said you’re using Flink 1.3; would that be 1.3.0 or 1.3.1? Best, Aljoscha > On 16. Jul 2017, at 13:42, Fabian Wollert wrote: > > Hi Aljoscha, > > we are running Flink in Stand

Re: Flink Elasticsearch Connector: Lucene Error message

2017-07-16 Thread Fabian Wollert
Hi Aljoscha, we are running Flink in standalone mode, inside Docker on AWS. I will check the dependencies tomorrow, although I'm wondering: I'm running Flink 1.3 everywhere and the appropriate ES connector, which was only released with 1.3, so it's weird where this dependency mix-up comes from ...

Re: global window trigger

2017-07-16 Thread Aljoscha Krettek
Hi, OK, then I misunderstood. Yes, a PurgingTrigger is similar (in fact, identical) to always returning FIRE_AND_PURGE instead of FIRE in a custom Trigger. I thought your problem was that data is never cleared away when using GlobalWindows. Is that not the case? Best, Aljoscha > On 14. Jul 2017, at
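A minimal sketch of the PurgingTrigger approach on GlobalWindows (the key function, count threshold, and sample data are made up for illustration):

    import org.apache.flink.api.java.functions.KeySelector;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.api.windowing.assigners.GlobalWindows;
    import org.apache.flink.streaming.api.windowing.triggers.CountTrigger;
    import org.apache.flink.streaming.api.windowing.triggers.PurgingTrigger;

    public class GlobalWindowPurgeExample {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            DataStream<Integer> input = env.fromElements(1, 2, 3, 4, 5, 6);

            input
                .keyBy(new KeySelector<Integer, Integer>() {
                    @Override
                    public Integer getKey(Integer value) {
                        return value % 2; // even vs. odd, just for the example
                    }
                })
                .window(GlobalWindows.create())
                // PurgingTrigger turns every FIRE of the wrapped trigger into
                // FIRE_AND_PURGE, so the window contents are cleared after each firing.
                .trigger(PurgingTrigger.of(CountTrigger.of(3)))
                .reduce((a, b) -> a + b)
                .print();

            env.execute("GlobalWindows with PurgingTrigger");
        }
    }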