[ https://issues.apache.org/jira/browse/FLINK-6573?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17606024#comment-17606024 ]

Jiabao Sun commented on FLINK-6573:
-----------------------------------

I noticed that this ticket has been reopened.
So I have submitted a new PR, 20848, to support a MongoDB streaming and SQL 
connector based on the new Source and Sink interfaces.
 * Support parallel read and write.
 * Support lookup table source.
 * Support scan table source.
 * Support limit pushdown.
 * Support projection pushdown.
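
To illustrate how such a SQL connector is typically used, here is a hypothetical sketch of a table definition and a query that could benefit from the pushdown features above. The connector identifier and option names ('mongodb', 'uri', 'database', 'collection') and the table schema are assumptions for illustration, not confirmed details of the PR:

```sql
-- Hypothetical MongoDB-backed table; option names are assumptions.
CREATE TABLE orders (
  _id STRING,
  customer_id STRING,
  amount DECIMAL(10, 2),
  PRIMARY KEY (_id) NOT ENFORCED
) WITH (
  'connector' = 'mongodb',
  'uri' = 'mongodb://localhost:27017',
  'database' = 'shop',
  'collection' = 'orders'
);

-- With limit and projection pushdown, a query like this could read only
-- the projected fields and stop after 10 documents on the MongoDB side:
SELECT _id, amount FROM orders LIMIT 10;
```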

> Flink MongoDB Connector
> -----------------------
>
>                 Key: FLINK-6573
>                 URL: https://issues.apache.org/jira/browse/FLINK-6573
>             Project: Flink
>          Issue Type: New Feature
>          Components: Connectors / Common
>    Affects Versions: 1.2.0
>         Environment: Linux Operating System, Mongo DB
>            Reporter: Nagamallikarjuna
>            Assignee: ZhuoYu Chen
>            Priority: Not a Priority
>              Labels: pull-request-available, stale-assigned
>         Attachments: image-2021-11-15-14-41-07-514.png
>
>   Original Estimate: 672h
>  Remaining Estimate: 672h
>
> Hi Community,
> We are currently using Flink in our project, and we have a huge amount of 
> data residing in MongoDB that we need to process with Flink. We require 
> parallel data connectivity between Flink and MongoDB for both reads and 
> writes. We are planning to create this connector and contribute it to the 
> community.
> I will share further details once I receive your feedback.
> Please let us know if you have any concerns.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
