[ https://issues.apache.org/jira/browse/FLINK-37687?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17954527#comment-17954527 ]

Hong Liang Teoh commented on FLINK-37687:
-----------------------------------------

Based on a PoC of the changes, we would need to solve the following problems:

 
 # *Flink public interface changes.* Sink.InitContext has been renamed to 
WriterInitContext (see the signature sketch after this list), and the 
ParameterTool base package has been migrated.
 ** *Impact:* Every sink connector exposes Sink.InitContext as a public 
interface, because it appears in the method signatures of the abstract class 
AsyncSinkBase. Supporting Flink 2.x is therefore a breaking change for the 
connectors repository.
 ** *Proposal:* I don't see a simple way to decouple the AWS repository code 
from being tied to either Flink 1.x or Flink 2.x. As such, I suggest that we 
split the branches: aws-connector-5.x supports flink-1.20.x and 
aws-connector-6.x supports flink-2.x. This means we have to maintain two 
active branches (since 1.20 is LTS), but I don't see a better way around this!
 # *Removal of the SinkFunction and SourceFunction interfaces.*
 ** *Impact:* FlinkKinesisConsumer and FlinkKinesisProducer have to be removed.
 ** *Proposal:* Remove both, along with their SQL counterparts, since 
KinesisStreamsSource and KinesisStreamsSink are available as replacements (see 
the migration sketch after this list).
 ** *Proposal:* The E2E tests and the GSR/JSON format tests currently rely on 
these interfaces, so they will need refactoring as well.
 # *Removal of support for JDK 8; JDK 17 is now the recommended Java version.*
 ** *Impact/Proposal:* CI/CD can drop the JDK 8 test runs and should ensure we 
test against JDK 17.
 # *Kryo is disabled by default in Flink 2.*
 ** *Impact/Proposal:* We should sanity-check that none of the new state 
classes falls back to Kryo. If any do, we should make it possible to use the 
POJO serializer instead of Kryo (see the serializer check sketch after this 
list).
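
Below is a minimal sketch of the Sink.InitContext -> WriterInitContext change 
described above, using a stand-in class rather than real connector code. The 
exact Flink 2.x shapes (WriterInitContext living in 
org.apache.flink.api.connector.sink2, the SinkWriter methods shown) are 
assumptions to verify against the 2.x API; the compile error quoted below in 
the issue description only confirms that Sink.InitContext no longer exists.

```java
// Sketch only: ExampleSink is a hypothetical stand-in, not connector code.
import org.apache.flink.api.connector.sink2.Sink;
import org.apache.flink.api.connector.sink2.SinkWriter;
import org.apache.flink.api.connector.sink2.WriterInitContext;

public class ExampleSink implements Sink<String> {

    // Flink 1.x signature (no longer compiles on 2.x, per the quoted error):
    //
    //     public SinkWriter<String> createWriter(InitContext context) { ... }
    //
    // Flink 2.x: the nested Sink.InitContext is gone and the writer is created
    // from a WriterInitContext instead, which is why AsyncSinkBase subclasses
    // (and anything exposing InitContext in its signatures) break.
    @Override
    public SinkWriter<String> createWriter(WriterInitContext context) {
        return new SinkWriter<String>() {
            @Override
            public void write(String element, Context context) {
                // no-op: sketch only
            }

            @Override
            public void flush(boolean endOfInput) {}

            @Override
            public void close() {}
        };
    }
}
```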
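
And a short migration sketch for the FlinkKinesisConsumer / 
FlinkKinesisProducer removal above, showing the shape of the replacements. The 
builder method names (setStreamArn, setDeserializationSchema, 
setKinesisClientProperties, setStreamName, setSerializationSchema, 
setPartitionKeyGenerator) follow the current connector documentation but 
should be treated as assumptions until verified on the 2.x branch.

```java
import java.util.Properties;

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kinesis.sink.KinesisStreamsSink;
import org.apache.flink.connector.kinesis.source.KinesisStreamsSource;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KinesisMigrationSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Replacement for FlinkKinesisConsumer: the FLIP-27 style source.
        KinesisStreamsSource<String> source =
                KinesisStreamsSource.<String>builder()
                        .setStreamArn("arn:aws:kinesis:us-east-1:000000000000:stream/example-input")
                        .setDeserializationSchema(new SimpleStringSchema())
                        .build();

        DataStream<String> records =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "Kinesis source");

        // Replacement for FlinkKinesisProducer: the AsyncSinkBase-based sink.
        Properties sinkProperties = new Properties();
        sinkProperties.setProperty("aws.region", "us-east-1");

        KinesisStreamsSink<String> sink =
                KinesisStreamsSink.<String>builder()
                        .setKinesisClientProperties(sinkProperties)
                        .setStreamName("example-output")
                        .setSerializationSchema(new SimpleStringSchema())
                        .setPartitionKeyGenerator(element -> String.valueOf(element.hashCode()))
                        .build();

        records.sinkTo(sink);
        env.execute("Kinesis migration sketch");
    }
}
```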

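
For the Kryo point above, one way to do the sanity check is a unit test that 
asserts the type information extracted for each state class is not the Kryo 
fallback. The test below is a sketch: ExampleState is a hypothetical stand-in 
for the real connector state classes, and I'm assuming the JUnit 5 + AssertJ 
style used here matches the repository's test setup.

```java
import static org.assertj.core.api.Assertions.assertThat;

import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.typeutils.GenericTypeInfo;
import org.apache.flink.api.java.typeutils.PojoTypeInfo;
import org.junit.jupiter.api.Test;

class StateSerializerSanityTest {

    /** Hypothetical stand-in for a connector state class; a valid Flink POJO. */
    public static class ExampleState {
        private String requestId;

        public ExampleState() {}

        public String getRequestId() {
            return requestId;
        }

        public void setRequestId(String requestId) {
            this.requestId = requestId;
        }
    }

    @Test
    void stateClassDoesNotFallBackToKryo() {
        TypeInformation<ExampleState> typeInfo = TypeInformation.of(ExampleState.class);

        // GenericTypeInfo marks the Kryo fallback; a proper POJO should be
        // extracted as PojoTypeInfo and use the POJO serializer instead.
        assertThat(typeInfo).isNotInstanceOf(GenericTypeInfo.class);
        assertThat(typeInfo).isInstanceOf(PojoTypeInfo.class);
    }
}
```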
A couple of areas still need to be checked:

1/ Glue-schema-registry formats: Are there any breaking changes in the table 
schema interfaces that need tidying up / refactoring?

2/ Python: Which Python versions are supported with Flink 2.x? We should 
adjust our Python support to account for that.

 

 

> Bump flink-connector-aws to 2.0
> -------------------------------
>
>                 Key: FLINK-37687
>                 URL: https://issues.apache.org/jira/browse/FLINK-37687
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / AWS, Connectors / DynamoDB, Connectors / 
> Firehose, Connectors / Kinesis
>    Affects Versions: 2.0.0
>            Reporter: Keith Lee
>            Priority: Major
>             Fix For: aws-connector-5.1.0
>
>
> Flink 2.0 has breaking changes on interfaces that connectors implement e.g. 
> Sink 
> ```
> Error:  
> /home/runner/work/flink-connector-aws/flink-connector-aws/flink-connector-aws/flink-connector-dynamodb/src/main/java/org/apache/flink/connector/dynamodb/sink/DynamoDbBeanElementConverter.java:[67,26]
>  cannot find symbol
> Error:    symbol:   class InitContext
> Error:    location: interface org.apache.flink.api.connector.sink2.Sink
> Error:  
> /home/runner/work/flink-connector-aws/flink-connector-aws/flink-connector-aws/flink-connector-dynamodb/src/main/java/org/apache/flink/connector/dynamodb/sink/DynamoDbSinkWriter.java:[160,5]
>  method does not override or implement a method from a supertype
> ```
> 1. We need to discuss strategy on maintaining branches for 1.20 LTS and also 
> 2.x
> 2. We need to work on migrating current sources/sinks to 2.x on 2.x branch



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
