Hi all,
I have a Flink application where I need to read Avro files from S3 that are
partitioned by date and hour. I need to read in multiple dates, meaning I need
to read files from multiple folders. Does anyone know how I can do this? My
application is written in Scala using Flink 1.17.1.
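One approach (a sketch, not a definitive answer): Flink's file sources accept multiple paths, so you can enumerate the date/hour partition folders up front and hand them all to a single source. The bucket name and `date=`/`hour=` layout below are assumptions for illustration; the Flink wiring is shown only in comments because it needs the `flink-formats-avro` dependency.

```scala
import java.time.LocalDate

// Enumerate S3 partition folders for an inclusive date range.
// Assumed layout (hypothetical): s3://<bucket>/date=YYYY-MM-DD/hour=HH/
def partitionPaths(bucket: String, start: LocalDate, end: LocalDate): Seq[String] =
  Iterator.iterate(start)(_.plusDays(1))
    .takeWhile(!_.isAfter(end))
    .flatMap(day => (0 until 24).map(h => f"s3://$bucket/date=$day/hour=$h%02d/"))
    .toSeq

// Sketch of the Flink side (not compiled here; requires flink-formats-avro):
//   val paths  = partitionPaths("my-bucket", startDate, endDate).map(new Path(_))
//   val format = new AvroInputFormat[GenericRecord](paths.head, classOf[GenericRecord])
//   format.setFilePaths(paths: _*)   // FileInputFormat accepts multiple paths
//   val stream = env.createInput(format)
```

`setFilePaths` comes from `FileInputFormat`, which `AvroInputFormat` extends, so one format instance can scan all enumerated folders.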
Hello,
I'm using Flink 1.17.1 and I have state TTL enabled in one of my Flink jobs,
where I'm using RocksDB as the state backend for checkpointing. I have a value
state of a POJO class (which is generated from an Avro schema). I added a new
field to my schema along with a default value to make sure it is backwards
compatible
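For reference, Avro schema evolution only stays backwards compatible if every added field carries a `default`, so that state written with the old schema can still be deserialized. A minimal illustration (record and field names are hypothetical):

```json
{
  "type": "record",
  "name": "SessionState",
  "fields": [
    {"name": "count", "type": "long"},
    {"name": "label", "type": ["null", "string"], "default": null}
  ]
}
```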
FLINK-31319 <https://issues.apache.org/jira/browse/FLINK-31319>
Best,
Feng
On Tue, Mar 12, 2024 at 7:21 PM irakli.keshel...@sony.com wrote:
Hello,
I have a Flink job that is running in Batch mode. The source for the job is
a Kafka topic which has a limited number of events. I can see that the job
starts running fine and consumes the events, but it never makes it past the
first task and becomes idle. The Kafka source is defined to be
(https://nightlies.apache.org/flink/flink-docs-master/docs/dev/datastream/event-time/generating_watermarks/#watermark-strategies-and-the-kafka-connector).
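On the idleness point in that docs section: with the Kafka connector, watermarks are generated per partition/split and the operator watermark is the minimum across them, so a single partition that stops receiving events pins event time in place (which is why `withIdleness` exists). A toy model of that min-combination:

```scala
// Toy model: an operator's watermark is the minimum of its inputs' watermarks.
// A partition that has emitted nothing is effectively at Long.MinValue.
def combinedWatermark(perPartition: Seq[Long]): Long = perPartition.min
```

With one silent partition, `combinedWatermark(Seq(5000L, 7000L, Long.MinValue))` stays at `Long.MinValue`, so no event-time timer downstream can fire.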
On Mar 4, 2024 at 15:54 +0100, irakli.keshel...@sony.com wrote:
Hello,
I have a Flink job which is processing a bounded number of events. Initially, I
was running the job in "STREAMING" mode, but I realized that running it in
"BATCH" mode was better, as I don't have to deal with the watermark strategy.
The job is reading the data from the Kafka topic a
Hello,
I have a Flink application that consumes events from a Kafka topic and builds
sessions from them. I'm using a keyed stream. The application runs fine
initially, but after some time it gets "stuck". I can see that the
"processElement" function is processing the incoming events
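A common cause of a keyed sessionization job appearing "stuck" is that `processElement` keeps buffering events while the event-time timer that would close a session never fires, because the watermark has stalled (for example, idle Kafka partitions). The gap-based grouping such a function typically implements can be sketched with plain collections (the gap value and timestamp representation are assumptions):

```scala
// Minimal sketch of gap-based sessionization, mirroring what a keyed
// processElement + event-time timer usually implements in Flink.
// Assumption: timestamps are epoch millis, already sorted per key.
def sessionize(timestamps: Seq[Long], gapMs: Long): Seq[Seq[Long]] =
  timestamps.foldLeft(Vector.empty[Vector[Long]]) { (sessions, ts) =>
    sessions.lastOption match {
      case Some(cur) if ts - cur.last <= gapMs => sessions.init :+ (cur :+ ts)
      case _                                   => sessions :+ Vector(ts)
    }
  }
```

In the real job, the Flink runtime only "closes" a session when the watermark passes last-event-time + gap, so checking in the web UI whether the watermark is still advancing, and adding `withIdleness` to the watermark strategy, are typical first diagnostic steps.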