Hello!

We need to read data from an Oracle database table in order to pass it to
Kafka.
New rows are inserted into the table periodically.
The table has multiple partitions.
The data should be processed in parallel, with each task consuming one
partition of the database table.

Can this be done using a Flink task?
How would Flink keep track of which records it has already read and which
it hasn't?
Can Flink work with composite table partitioning?
Could you please give an example of how to perform this kind of task?
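To make the question concrete, here is a rough Flink SQL sketch of what I have in mind (table names, columns, and connection details are placeholders; the scan.partition options are the JDBC connector's way of splitting a bounded scan across parallel tasks):

```sql
-- Hypothetical Oracle source table; splits the scan into 4 parallel reads
-- by ranges of the numeric column "id".
CREATE TABLE orders_src (
  id      BIGINT,
  payload STRING
) WITH (
  'connector' = 'jdbc',
  'url' = 'jdbc:oracle:thin:@//dbhost:1521/ORCL',
  'table-name' = 'ORDERS',
  'scan.partition.column' = 'id',
  'scan.partition.num' = '4',
  'scan.partition.lower-bound' = '0',
  'scan.partition.upper-bound' = '1000000'
);

-- Hypothetical Kafka sink.
CREATE TABLE orders_sink (
  id      BIGINT,
  payload STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'orders',
  'properties.bootstrap.servers' = 'kafka:9092',
  'format' = 'json'
);

INSERT INTO orders_sink SELECT * FROM orders_src;
```

As I understand it, the JDBC scan splits by value ranges rather than by Oracle's own table partitions, and it is a one-time bounded read rather than an incremental one, which is part of why I'm asking.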

Best Regards,
Konstantin.
