Hi David,

Thanks for the clarification. I will check the link you shared. Also, as
Dominik mentioned, could you help me with process functions? How can I use
them for my use case?

Thanks,
Siddhesh
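
As a rough sketch of the use case above: a process function sees one record
at a time in processElement, and the per-record type check could look like
the plain-Java snippet below. Flink's own classes are omitted on purpose so
the snippet compiles standalone; `validate` is a hypothetical helper, not a
Flink API.

```java
import java.util.Map;

// Sketch of the per-record check that could live inside a Flink
// KeyedProcessFunction's processElement. Flink imports are omitted so
// this compiles on its own; `validate` is a made-up helper name.
public class TypeCheckSketch {

    // Returns true if the parsed JSON field exists and has the expected type.
    static boolean validate(Map<String, Object> record, String field,
                            Class<?> expected) {
        Object value = record.get(field);
        // isInstance(null) is false, so a missing field fails the check.
        return expected.isInstance(value);
    }

    public static void main(String[] args) {
        // Stand-in for one deserialized JSON record from the Kafka topic.
        Map<String, Object> record = Map.of("id", 42, "name", "flink");
        System.out.println(validate(record, "id", Integer.class));   // true
        System.out.println(validate(record, "name", Integer.class)); // false
    }
}
```

In a real job, a record failing such a check would typically be routed to a
side output rather than dropped or thrown on.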


On Wed, Dec 29, 2021 at 3:22 PM Siddhesh Kalgaonkar <
kalgaonkarsiddh...@gmail.com> wrote:

> On Wed, Dec 29, 2021 at 2:50 PM David Morávek <d...@apache.org> wrote:
>
>> Hi Siddhesh,
>>
>> It seems the question is already being answered in the SO thread, so
>> let's keep the discussion focused there.
>>
>> Looking at the original question, I think it's important to understand
>> that TypeInformation is not meant for "runtime" type matching; rather, it
>> addresses the type-erasure [1] limitation for UDFs (user-defined
>> functions) so that Flink can pick the correct serializer/deserializer.
>>
>> [1] https://docs.oracle.com/javase/tutorial/java/generics/erasure.html
>>
>> Best,
>> D.
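
To make the type-erasure point concrete, here is a minimal plain-Java
sketch (no Flink code involved): after erasure, both lists below share a
single runtime class, so the element type is unrecoverable at runtime,
which is exactly the gap TypeInformation fills for Flink's serializers.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal demonstration of Java type erasure: the element type of a
// generic container is gone at runtime, so both lists share one class.
public class ErasureDemo {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<>();
        List<Integer> ints = new ArrayList<>();
        System.out.println(strings.getClass() == ints.getClass()); // true
    }
}
```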
>>
>> On Tue, Dec 28, 2021 at 9:21 PM Siddhesh Kalgaonkar <
>> kalgaonkarsiddh...@gmail.com> wrote:
>>
>>> Hi Team,
>>>
>>> I am a newbie to Flink and Scala and trying my best to learn everything
>>> I can. I am doing a practice exercise where I am getting incoming JSON
>>> data from a Kafka topic and want to perform a data type check on it.
>>> For that, I came across TypeInformation of Flink. Please read my problem
>>> in detail from the below link:
>>>
>>> Flink Problem
>>> <https://stackoverflow.com/questions/70500023/typeinformation-in-flink-to-compare-the-datatypes-dynamically>
>>>
>>> I went through the documentation but didn't come across any relevant
>>> examples. Any suggestions would help.
>>>
>>> Looking forward to hearing from you.
>>>
>>>
>>> Thanks,
>>> Siddhesh
>>>
>>
