Hi Soheil,

I assume you are using the `DataStream` API. Please check the documentation [1]
for more information; others have already covered the interface details.

Interfaces aside, I'm wondering how you could read a MySQL table
"continuously". Kafka works as a message queue, which makes it convenient to
consume incremental messages. How do you plan to do that with a MySQL table?
Through the binary log?

1.
https://ci.apache.org/projects/flink/flink-docs-release-1.8/dev/datastream_api.html#data-sources
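One common workaround (absent binlog access) is to poll the table and track an
auto-increment id, emitting only rows newer than the last one seen. Below is a
minimal, hypothetical sketch of that idea: the class names, the `FakeTable`
stand-in, and the `pollOnce` helper are all invented for illustration; a real
implementation would extend Flink's `SourceFunction` and issue a JDBC query
like `SELECT ... WHERE id > ?` instead.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentSkipListMap;

// Hypothetical sketch (NOT Flink's actual SourceFunction) of how a
// continuous MySQL source could track progress with an auto-increment id.
public class IncrementalPollSketch {

    // Stand-in for the MySQL table: id -> row payload.
    static class FakeTable {
        final ConcurrentSkipListMap<Long, String> rows = new ConcurrentSkipListMap<>();
        void insert(long id, String payload) { rows.put(id, payload); }
    }

    private final FakeTable table;
    private volatile boolean running = true;  // flipped by cancel()
    private long lastSeenId = 0L;             // polling offset, like "WHERE id > ?"

    IncrementalPollSketch(FakeTable table) { this.table = table; }

    // One polling round: emit only rows strictly newer than lastSeenId.
    List<String> pollOnce() {
        List<String> emitted = new ArrayList<>();
        for (Map.Entry<Long, String> e : table.rows.tailMap(lastSeenId, false).entrySet()) {
            emitted.add(e.getValue());
            lastSeenId = e.getKey();
        }
        return emitted;
    }

    // Mirrors SourceFunction#run: poll until cancelled.
    void run(List<String> out) throws InterruptedException {
        while (running) {
            out.addAll(pollOnce());
            Thread.sleep(100); // back off between polls
        }
    }

    // Mirrors SourceFunction#cancel: lets run() exit cleanly.
    void cancel() { running = false; }

    public static void main(String[] args) {
        FakeTable t = new FakeTable();
        t.insert(1, "a");
        t.insert(2, "b");
        IncrementalPollSketch src = new IncrementalPollSketch(t);
        System.out.println(src.pollOnce()); // first poll: both existing rows
        t.insert(3, "c");
        System.out.println(src.pollOnce()); // second poll: only the new row
    }
}
```

Note this approach only sees inserts with growing ids; it misses updates and
deletes, which is exactly why binlog-based capture is the more robust option.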


Caizhi Weng <tsreape...@gmail.com> wrote on Tue, Jul 16, 2019 at 10:12 AM:

> Hi Soheil,
>
> It's not recommended to implement a streaming source using `InputFormat`
> (it's mainly intended for batch sources). To implement a streaming source,
> `SourceFunction` is recommended.
>
> The Javadoc for `SourceFunction` clearly explains (with examples) how to
> write the `run` and `cancel` methods. You can refer to that to write
> your own MySQL streaming source.
>
> Soheil Pourbafrani <soheil.i...@gmail.com> wrote on Tue, Jul 16, 2019 at 7:29 AM:
>
>> Hi,
>>
>> By extending the `RichInputFormat` class I could create my own MySQL input.
>> I want to use it to read data continuously from a table, but I observed
>> that `RichInputFormat` reads all the data and then finishes the job.
>>
>> I guess that to read data continuously I need to extend
>> `SourceFunction`, but I noticed it has only two methods: run() and
>> cancel().
>>
>> So I was wondering: is it possible to implement a new class that reads
>> data from MySQL tables continuously, like we can with the Kafka connector?
>>
>> Thanks
>>
>
