Sorry, but I'm not really familiar with that interface. As far as I can see,
it's mainly used in the DataSet API. Most of its implementations provide a
TypeSerializer, like `DoubleValueSerializer`, which works the same way as a
KryoSerializer or a custom serializer. So I suppose you don't have to worry
about it.
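If it helps, the read/write pattern those Value implementations follow can be sketched with plain-JDK streams. This is an illustrative stand-in only, not Flink's actual `DoubleValue` (the real `Value` interface uses Flink's `DataInputView`/`DataOutputView` rather than `java.io` streams, and the class name here is invented):

```java
import java.io.*;

// Hypothetical sketch of the read/write pattern used by Flink's Value types.
// Flink's real interface uses DataInputView/DataOutputView; plain java.io
// streams are used here so the example is self-contained.
public class DoubleValueSketch {
    private double value;

    public DoubleValueSketch() {}
    public DoubleValueSketch(double value) { this.value = value; }

    public double getValue() { return value; }

    // Serialize the wrapped primitive directly, with no class metadata.
    public void write(DataOutput out) throws IOException {
        out.writeDouble(value);
    }

    // Deserialize in place, mutating this instance instead of allocating.
    public void read(DataInput in) throws IOException {
        value = in.readDouble();
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        new DoubleValueSketch(3.25).write(new DataOutputStream(bytes));

        DoubleValueSketch copy = new DoubleValueSketch();
        copy.read(new DataInputStream(
                new ByteArrayInputStream(bytes.toByteArray())));
        System.out.println(copy.getValue()); // prints 3.25
    }
}
```

The point of the pattern is that a value knows how to write exactly its own fields, so no per-record type tags are needed on the wire.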
Thanks,
On Thu, Nov 10, 2022 at 6:21 PM Gen Luo wrote:
>
> I suppose it would be fine. The only difference I can think of that may
> matter is serialization: Flink uses KryoSerializer as the fallback
> serializer when the TypeInformation of the records is not provided, and
> Kryo can properly handle abstract classes. This works well in most cases.
Hi Davide,
I suppose it would be fine. The only difference I can think of that may
matter is serialization: Flink uses KryoSerializer as the fallback
serializer when the TypeInformation of the records is not provided, and
Kryo can properly handle abstract classes. This works well in most cases.
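The reason a generic fallback serializer copes with abstract classes is that it records the concrete runtime class of each record, so a value declared as the abstract supertype round-trips with its subtype intact. The same property can be illustrated with plain JDK serialization (Kryo's wire format differs, and the `Animal`/`Dog` names here are just for illustration):

```java
import java.io.*;

// Illustrates why a fallback serializer can handle abstract classes: it
// writes the concrete class of each record, so a value declared as the
// abstract supertype deserializes back to its original subtype. Plain JDK
// serialization is used for illustration; Kryo works differently internally.
public class PolymorphicRoundTrip {

    abstract static class Animal implements Serializable {
        abstract String sound();
    }

    static class Dog extends Animal {
        String sound() { return "woof"; }
    }

    static Animal roundTrip(Animal in) throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(in); // records the concrete class name + fields
        }
        try (ObjectInputStream restore =
                 new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray()))) {
            return (Animal) restore.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Animal restored = roundTrip(new Dog());
        System.out.println(restored.getClass().getSimpleName()); // prints Dog
        System.out.println(restored.sound()); // prints woof
    }
}
```

In Flink itself you can also sidestep the Kryo fallback by supplying TypeInformation explicitly, for example via the `returns(...)` type hint on a transformation.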
Greetings,
I am looking at a Flink pipeline processing events consumed from a Kafka
topic, which now needs to also consume events that have a different, but
related, schema. Traditional Java OOP would suggest transitioning from
class Dog { ... }
new FilterFunction { ... }
to
abstract class
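For what it's worth, the transition being described might be sketched like this. All class, field, and method names below are invented for illustration, and Flink's `FilterFunction` is mimicked with a plain functional interface so the sketch is self-contained:

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical sketch of the refactor described above: a concrete Dog event
// type generalized into an abstract Animal base class so one pipeline can
// consume both related schemas. FilterFunction is a stand-in for Flink's
// org.apache.flink.api.common.functions.FilterFunction<T>.
public class AnimalPipelineSketch {

    // After the refactor: an abstract base with one subclass per schema.
    abstract static class Animal {
        final String name;
        Animal(String name) { this.name = name; }
        abstract boolean isVaccinated(); // invented example field
    }

    static class Dog extends Animal {
        Dog(String name) { super(name); }
        boolean isVaccinated() { return true; }
    }

    static class Cat extends Animal {
        Cat(String name) { super(name); }
        boolean isVaccinated() { return false; }
    }

    // Stand-in for Flink's FilterFunction<T>.
    interface FilterFunction<T> {
        boolean filter(T value);
    }

    public static void main(String[] args) {
        // One filter written against the abstract type handles both schemas.
        FilterFunction<Animal> vaccinatedOnly = Animal::isVaccinated;

        List<Animal> events = List.of(new Dog("rex"), new Cat("tom"));
        List<String> kept = events.stream()
            .filter(vaccinatedOnly::filter)
            .map(a -> a.name)
            .collect(Collectors.toList());
        System.out.println(kept); // prints [rex]
    }
}
```

The filter is written once against `Animal`, so adding a further schema variant only means adding another subclass.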