Re: kafka corrupt record exception

2019-04-25 Thread Dominik Wosiński
Hey, sorry for the delay, I missed this message. Technically you could have a Kafka broker installed in, say, version 1.0.0 while using FlinkKafkaConsumer08, and that version mismatch could create issues. I'm not sure whether you can automate the process of skipping corrupted messages, as you…

Re: kafka corrupt record exception

2019-04-08 Thread Sushant Sawant
Hi, yes exactly, I am already using a custom deserialization schema. Currently I am doing the same: checking the corrupt record's offset and starting the consumer from the next offset. But that requires continuous monitoring and finding the corrupt record manually. Any idea how I could build a program for this? And cou…
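The manual workaround described here (note the corrupt offset, restart the consumer one past it) maps to Flink's `FlinkKafkaConsumer#setStartFromSpecificOffsets(Map<KafkaTopicPartition, Long>)`. Since that call needs a running Flink job and the Kafka connector on the classpath, the sketch below only builds the resume-offset map with a plain `String` key standing in for `KafkaTopicPartition`; all names here are illustrative, not from the thread.

```java
import java.util.HashMap;
import java.util.Map;

public class SkipOffsets {
    // Build a map telling the consumer to resume one offset past the corrupt record.
    // The "topic-partition" string key is a stand-in for Flink's KafkaTopicPartition.
    static Map<String, Long> resumeAfter(String topic, int partition, long corruptOffset) {
        Map<String, Long> offsets = new HashMap<>();
        offsets.put(topic + "-" + partition, corruptOffset + 1);
        return offsets;
    }

    // In an actual Flink job the equivalent would look roughly like (assumption):
    //   Map<KafkaTopicPartition, Long> m = new HashMap<>();
    //   m.put(new KafkaTopicPartition("topic_log", 3), corruptOffset + 1);
    //   consumer.setStartFromSpecificOffsets(m);
}
```

This only automates the restart position; detecting the corrupt offset in the first place still requires inspecting the failure, which is what the rest of the thread discusses.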

Re: kafka corrupt record exception

2019-04-02 Thread Dominik Wosiński
Hey, as far as I understand, the error is not caused by deserialization but by the polling of the message itself, so a custom deserialization schema won't really help in this case. There seems to be an error in the messages in your topic. You can see here…

Re: kafka corrupt record exception

2019-04-02 Thread Ilya Karpov
According to the docs (here: https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kafka.html#the-deserializationschema , last paragraph) that’s expected behaviour. Maybe…
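The paragraph linked above describes Flink's documented escape hatch for records that fail to deserialize: a `DeserializationSchema` that returns `null` causes the Kafka consumer to silently skip that record instead of failing the job (this covers deserialization errors only, not the fetch-level corruption discussed elsewhere in the thread). A minimal plain-Java sketch of the pattern, with the validity check being a hypothetical placeholder:

```java
import java.nio.charset.StandardCharsets;

public class TolerantSchema {
    // Mirrors the shape of DeserializationSchema#deserialize: return null
    // instead of throwing, so Flink drops the record and keeps consuming.
    static String deserialize(byte[] message) {
        try {
            String s = new String(message, StandardCharsets.UTF_8);
            if (!s.startsWith("{")) {  // stand-in validity check (assumption)
                return null;           // null => Flink skips this record
            }
            return s;
        } catch (RuntimeException e) {
            return null;               // any parse failure: skip, don't fail the job
        }
    }
}
```

In a real schema the same `return null` would live inside the `deserialize` method of a class implementing `DeserializationSchema<T>`, and the skipped offsets could be logged or counted there to address the monitoring concern raised earlier.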

Re: kafka corrupt record exception

2019-04-01 Thread Sushant Sawant
Hi, thanks for the reply. But is there a way one could skip this corrupt record from the Flink consumer? The Flink job goes in a loop: it restarts and then fails again at the same record. On Mon, 1 Apr 2019, 07:34 Congxian Qiu wrote: > Hi > As you said, consuming from the ubuntu terminal gives the same error, maybe y…

Re: kafka corrupt record exception

2019-03-31 Thread Congxian Qiu
Hi, as you said, consuming from the ubuntu terminal gives the same error, so maybe you could send an email to the Kafka user mailing list. Best, Congxian. On Apr 1, 2019, 05:26 +0800, Sushant Sawant wrote: > Hi team, > I am facing this exception: > org.apache.kafka.common.KafkaException: Received exception when fet…

kafka corrupt record exception

2019-03-31 Thread Sushant Sawant
Hi team, I am facing this exception: org.apache.kafka.common.KafkaException: Received exception when fetching the next record from topic_log-3. If needed, please seek past the record to continue consumption. at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.fetchReco…
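The exception message itself suggests the raw Kafka-client-level remedy: catch the failure and seek past the offending record. With `kafka-clients` that would be `consumer.seek(topicPartition, badOffset + 1)` inside the catch block around `poll`, but that needs a live broker, so the sketch below simulates only the control flow with a fake poll that throws at one "corrupt" offset; all names and the failure offset are invented for illustration.

```java
public class SeekPastCorrupt {
    // Stand-in for KafkaConsumer#poll on a single partition; offset 3 plays
    // the corrupt record from the exception in the thread (assumption).
    static String pollAt(long offset) {
        if (offset == 3) {
            throw new RuntimeException("Received exception when fetching the next record");
        }
        return "record@" + offset;
    }

    // Consume offsets [from, to), skipping past any offset whose fetch fails.
    static int consume(long from, long to) {
        int consumed = 0;
        long offset = from;
        while (offset < to) {
            try {
                pollAt(offset);
                consumed++;
                offset++;      // normal advance after a successful fetch
            } catch (RuntimeException e) {
                offset++;      // real code: consumer.seek(tp, offset + 1)
            }
        }
        return consumed;
    }
}
```

Note this skips data silently, so as discussed above you would want to log each skipped offset rather than discard it without a trace.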