Thanks Danny for working on this!

It would be good to do this in a way that lets the different connectors
reuse as much code as possible, so if possible, put most of the code into
the flink-connector-shared-utils repo [1].

+1 from me for the general direction (non-binding)

Thanks,
Peter

[1] https://github.com/apache/flink-connector-shared-utils


On Mon, 8 Jan 2024 at 17:31, Danny Cranmer <dannycran...@apache.org> wrote:

> Hello all,
>
> I have been working with Péter and Marton on externalizing the Python
> connectors [1] from the main repo to the connector repositories. We have
> moved the code and the CI is running tests for the Kafka and AWS
> connectors. I am now looking into the release process.
>
> When we undertake a Flink release we perform the following steps [2]
> regarding Python: 1/ run the Python build on CI, 2/ download the wheel
> artifacts, 3/ upload the artifacts to dist and 4/ deploy to PyPI. The plan
> is to follow the same steps for connectors, using GitHub Actions instead
> of Azure Pipelines.
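>
> As an illustration, here is a minimal Python sketch of steps 2-4, assuming
> the wheels from step 1 have already been downloaded from CI into ./dist,
> that twine is used for the PyPI upload, and the usual SVN-backed
> dist.apache.org layout; the paths and commit message are placeholders,
> not the final release tooling:
>
>     import glob
>     import subprocess
>
>     # 2/ wheel artifacts are assumed to be downloaded from CI into ./dist
>     wheels = glob.glob("dist/*.whl")
>
>     # 3/ stage the artifacts in the (SVN-backed) Apache dist dev area
>     subprocess.run(["svn", "checkout",
>                     "https://dist.apache.org/repos/dist/dev/flink",
>                     "svn-dev"], check=True)
>     subprocess.run(["cp", *wheels, "svn-dev/"], check=True)
>     subprocess.run(["svn", "add", "--force", "svn-dev"], check=True)
>     subprocess.run(["svn", "commit", "-m", "Add connector wheels",
>                     "svn-dev"], check=True)
>
>     # 4/ deploy the same wheels to PyPI
>     subprocess.run(["twine", "upload", *wheels], check=True)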
>
> Today we have a single PyPI project for PyFlink that contains all the
> Flink libs, apache-flink [3]. I propose we create a new PyPI project per
> connector, using the existing connector version and following the naming
> convention apache-<connector-name>, for example:
> apache-flink-connector-aws, apache-flink-connector-kafka. Therefore, to
> use a DataStream API connector in Python, users would first need to
> install the lib, for example "python -m pip install
> apache-flink-connector-aws".
>
> Once we have consensus I will update the release process and perform a
> release of the flink-connector-aws project to test it end-to-end. I look
> forward to any feedback.
>
> Thanks,
> Danny
>
> [1] https://issues.apache.org/jira/browse/FLINK-33528
> [2]
> https://cwiki.apache.org/confluence/display/FLINK/Creating+a+Flink+Release
> [3] https://pypi.org/project/apache-flink/
>
