Hi all,

I'm wondering whether it's possible, and what the best way would be, to load
multiple files with a JSON source into a JDBC sink.
I'm running Flink 1.7.0.

Let's say I have about 1,500 files with the same structure (same format,
schema, everything) and I want to load them with a *batch* job.
Can Flink handle loading each of these files through a single source and
sending the data to my JDBC sink?
Ideally, I could provide the URL of the directory containing my thousand
files to the batch source and have it load all of them sequentially.
My sources and sinks are currently implemented for BatchTableSource; I guess
the cost of making them available for streaming would be quite high for me
at the moment.
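To make it concrete, here is a rough sketch of the kind of job I have in
mind. The JDBCOutputFormat builder comes from the flink-jdbc module as I
understand it in 1.7; the driver, URL, query, and the JSON-to-Row mapper
are placeholders for my actual setup, so please correct me if the approach
itself is wrong:

```java
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.io.jdbc.JDBCOutputFormat;
import org.apache.flink.types.Row;

public class JsonDirToJdbc {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // readTextFile accepts a directory path and, as far as I can tell,
        // reads every file inside it as part of the same batch source.
        DataSet<String> lines = env.readTextFile("file:///data/json-files/");

        // Placeholder parsing step: turn each JSON line into a Row matching
        // the target table's columns (real parsing would use a JSON library).
        DataSet<Row> rows = lines.map(new MapFunction<String, Row>() {
            @Override
            public Row map(String jsonLine) {
                Row row = new Row(2);
                // ... extract fields from jsonLine here ...
                return row;
            }
        });

        rows.output(JDBCOutputFormat.buildJDBCOutputFormat()
                .setDrivername("org.postgresql.Driver")           // assumption
                .setDBUrl("jdbc:postgresql://host/db")            // assumption
                .setQuery("INSERT INTO my_table (a, b) VALUES (?, ?)")
                .finish());

        env.execute("json-dir-to-jdbc");
    }
}
```

Is something along these lines the intended way to do it, or is there a
better mechanism for pointing a batch source at a whole directory?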

Has anyone ever done this?
Am I wrong to expect to do this with a batch job?

All the best

François Lacombe
