Thanks, haicheng, for your work.

1. Merging the MCP-related service means that every future API change
must also be adapted for MCP; otherwise the MCP service will quickly
become outdated (see the sketch below).
2. Introducing a feature also introduces a significant amount of
maintenance work, which may last for several years.
3. It may be more appropriate for us to maintain the related code in a
new repository with its own release rhythm, just like seatunnel-web.

The premise is that the community decides we should provide an
official MCP service, rather than leaving the implementation to
downstream projects.
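
To make the coupling in point 1 concrete, here is a minimal sketch of
what an MCP tool wrapping a SeaTunnel REST endpoint might look like.
This is not the code from the PR; the base URL, port, and the
/running-jobs path are assumptions for illustration, and only the mcp
Python SDK usage (FastMCP) is standard:

    # Minimal sketch, not the PR's implementation: one MCP tool wrapping
    # one assumed SeaTunnel REST endpoint. Base URL, port, and path are
    # placeholders for illustration.
    import requests
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("seatunnel")

    SEATUNNEL_API = "http://localhost:8080"  # assumed REST API address


    @mcp.tool()
    def list_running_jobs() -> str:
        """Return the raw JSON describing currently running SeaTunnel jobs."""
        # assumed endpoint path; adjust to the real deployment
        resp = requests.get(f"{SEATUNNEL_API}/running-jobs", timeout=10)
        resp.raise_for_status()
        return resp.text


    if __name__ == "__main__":
        # stdio transport so an LLM client (e.g. Claude Desktop) can launch it
        mcp.run()

Every time we change or add a REST endpoint, wrappers like this one have
to be updated in lockstep, which is exactly the maintenance commitment
points 1 and 2 describe.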

On Mon, Mar 31, 2025 at 15:16, haicheng zhang <[email protected]> wrote:
>
> Dear Apache SeaTunnel developer community members,
>
> Hello everyone!
>
> I am excited to announce that I have developed an MCP service for Apache
> SeaTunnel: a Model Context Protocol (MCP) server for interacting with
> SeaTunnel through LLM interfaces such as Claude. This addition not only
> enriches SeaTunnel's feature set, but also aims to improve the flexibility
> and efficiency of data processing and integration.
>
> To contribute this work to the entire community and support the continued
> development of SeaTunnel, I have submitted a Pull Request (PR) on GitHub:
>
> PR address: https://github.com/apache/seatunnel/pull/9086
> In addition, this PR is based on the needs and plans discussed previously.
> The background of that discussion can be found in the related Issue:
>
> Issue address: https://github.com/apache/seatunnel/issues/9047
>
> I sincerely invite all community members, contributors, and project
> maintainers to take part in the discussion and review of this PR. Your
> opinions, technical insights, and suggestions for improving the
> functionality will shape the final form of this service. Whether it is
> optimizing the code logic, adding test cases, or improving the
> documentation, every piece of feedback helps us move SeaTunnel forward
> together.
>
> I look forward to your participation. Let us work together to make Apache
> SeaTunnel even better and more powerful!
>
> Thank you for your continued support and contribution!
>
> Sincerely,
>
> [ocean-zhc]
> [[email protected]/ocean-zhc]
> [2025/03/31]
>
