You can check the Oracle CDC connector, which provides that functionality:
https://ververica.github.io/flink-cdc-connectors/master/content/connectors/oracle-cdc.html
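
Below is a rough DataStream sketch of what that could look like, loosely following the incremental-snapshot example on that page. The hostname, credentials, schema/table names, Kafka address and topic are placeholders, and the exact builder API can differ depending on the flink-cdc and Flink versions you use:

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.sink.KafkaRecordSerializationSchema;
import org.apache.flink.connector.kafka.sink.KafkaSink;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

import com.ververica.cdc.connectors.base.options.StartupOptions;
import com.ververica.cdc.connectors.base.source.jdbc.JdbcIncrementalSource;
import com.ververica.cdc.connectors.oracle.source.OracleSourceBuilder;
import com.ververica.cdc.debezium.JsonDebeziumDeserializationSchema;

public class OracleToKafkaJob {
    public static void main(String[] args) throws Exception {
        // Incremental-snapshot source: the table is split into chunks that are
        // read in parallel, then the job switches to reading the redo log.
        // All connection values below are placeholders.
        JdbcIncrementalSource<String> oracleSource =
                new OracleSourceBuilder<String>()
                        .hostname("oracle-host")
                        .port(1521)
                        .databaseList("ORCLCDB")
                        .schemaList("MYSCHEMA")
                        .tableList("MYSCHEMA.MY_PARTITIONED_TABLE")
                        .username("flinkuser")
                        .password("flinkpw")
                        .startupOptions(StartupOptions.initial())  // snapshot first, then redo log
                        .deserializer(new JsonDebeziumDeserializationSchema())
                        .build();

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // Checkpoints are what record "which rows have already been read":
        // the source stores chunk progress and the redo-log position in them.
        env.enableCheckpointing(10_000L);

        KafkaSink<String> kafkaSink = KafkaSink.<String>builder()
                .setBootstrapServers("kafka:9092")
                .setRecordSerializer(
                        KafkaRecordSerializationSchema.builder()
                                .setTopic("oracle-changes")
                                .setValueSerializationSchema(new SimpleStringSchema())
                                .build())
                .build();

        env.fromSource(oracleSource, WatermarkStrategy.noWatermarks(), "oracle-cdc")
                .setParallelism(4)  // snapshot chunks are distributed over these subtasks
                .sinkTo(kafkaSink);

        env.execute("Oracle CDC to Kafka");
    }
}

As far as I know, the incremental snapshot source parallelizes the initial read by splitting the table into key-range chunks rather than by Oracle partitions, so composite partitioning on the database side should not need any special handling. After the snapshot it tracks its read position in the redo log through Flink checkpoints, which is how already-read records are not re-emitted.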

Best,
G.

On Tue, Feb 13, 2024 at 3:25 PM К В <konstantin.vereten...@gmail.com> wrote:

> Hello!
>
> We need to read data from an Oracle database table in order to pass it to
> Kafka.
> Data is inserted in the table periodically.
> The table has multiple partitions.
> Data should be processed in parallel; each task should consume one partition
> of the table.
>
> Can this be done using a Flink task?
> How will Flink determine which records it has already read and which it
> hasn't?
> Can Flink work with composite table partitioning?
> Could you please give an example of how to perform this kind of task?
>
> Best Regards,
> Konstantin.
>
