Great, thanks!
On Wed, Nov 18, 2020 at 18:21 Jark Wu wrote:
Yes, it works with all the formats supported by the Kafka connector.
On Thu, 19 Nov 2020 at 10:18, Slim Bouguerra wrote:
Hi Jark,
Thanks very much. Will this work with Avro?
On Tue, Nov 17, 2020 at 07:44 Jark Wu wrote:
Hi Slim,
In 1.11, I think you have to implement a custom FlinkKafkaPartitioner and
set its class name in the 'sink.partitioner' option.
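To illustrate the 1.11 route, here is a sketch of the core logic such a custom partitioner would carry. It is written as a plain Java class with no Flink dependency so it runs standalone; the class name and the key-hashing scheme are illustrative, and a real implementation would instead extend org.apache.flink.streaming.connectors.kafka.partitioner.FlinkKafkaPartitioner and be referenced through the 'sink.partitioner' option.

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

// Sketch of the partitioning logic a custom FlinkKafkaPartitioner would
// implement in Flink 1.11. The real class would extend
// org.apache.flink.streaming.connectors.kafka.partitioner.FlinkKafkaPartitioner<T>
// and be wired in via 'sink.partitioner' = '<your fully qualified class name>'.
class KeyHashPartitioner {

    // 'partitions' holds the partition ids of the target topic,
    // which Flink passes to the partitioner at runtime.
    int partition(byte[] serializedKey, int[] partitions) {
        if (partitions == null || partitions.length == 0) {
            throw new IllegalArgumentException("target topic has no partitions");
        }
        if (serializedKey == null) {
            return partitions[0]; // fallback for records without a key
        }
        // Stable hash of the key bytes, mapped onto the available partitions,
        // so equal keys always land on the same partition.
        return partitions[Math.floorMod(Arrays.hashCode(serializedKey), partitions.length)];
    }

    public static void main(String[] args) {
        KeyHashPartitioner p = new KeyHashPartitioner();
        int[] parts = {0, 1, 2, 3};
        byte[] key = "user-42".getBytes(StandardCharsets.UTF_8);
        System.out.println("user-42 -> partition " + p.partition(key, parts));
    }
}
```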
In 1.12, you can re-partition the data by specifying the key field (the Kafka
producer will partition data by the message key by default). You can do
this by adding
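For the 1.12 route, a minimal DDL sketch using the Kafka connector's 'key.fields' and 'key.format' options, which make the sink serialize the listed columns as the message key (topic, server, and column names here are made up for illustration):

```sql
-- Sketch of a Flink 1.12 sink table that partitions by the Kafka message key.
-- Table, topic, and column names are illustrative.
CREATE TABLE output_table (
  user_id STRING,
  action  STRING,
  ts      TIMESTAMP(3)
) WITH (
  'connector' = 'kafka',
  'topic' = 'output-topic',
  'properties.bootstrap.servers' = 'localhost:9092',
  -- columns forming the Kafka message key; the default Kafka producer
  -- partitioner then partitions records by this key
  'key.fields' = 'user_id',
  'key.format' = 'raw',
  'value.format' = 'avro'
);

-- Repartitioning is then a plain INSERT from the source table:
INSERT INTO output_table SELECT user_id, action, ts FROM input_table;
```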
Hi,
I'm pulling in some Flink SQL experts (in CC) to help you with this one :)
Cheers,
Gordon
On Tue, Nov 17, 2020 at 7:30 AM Slim Bouguerra wrote:
Hi,
I am trying to author a SQL job that repartitions one Kafka SQL table
into another Kafka SQL table.
As an example, the input and output tables have exactly the same SQL schema
(see below) and the same data; the only difference is that the new Kafka
stream needs to be repartitioned using a simple projection like