No, I think that should be all right.
On 06.05.20 16:57, Vishwas Siravara wrote:
Thanks, I figured that would be the case. I'm using the Flink tuple type in
the map functions, so there is no casting required now. Can you think of
any downsides of using Flink tuples in Scala code, especially since
Hi,
Flink will not do any casting between types. You either need to output the
correct (Scala) tuple type from the deserialization schema or insert a
step (say, a map function) that converts between the two types. The Java
Tuple2 type and the Scala tuple type, i.e. (foo, bar), have nothing in
common.
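A minimal sketch of such a conversion step, purely for illustration: the
FlinkTuple2 case class below stands in for Flink's
org.apache.flink.api.java.tuple.Tuple2 (which exposes public fields f0 and
f1) so the sketch runs without a Flink dependency. In the real job you
would import the actual Tuple2 class and apply the same map on the
DataStream coming out of the Kafka source.

```scala
// Stand-in for org.apache.flink.api.java.tuple.Tuple2 (fields f0, f1).
// In the actual job, import the Flink class instead of defining this.
final case class FlinkTuple2[A, B](f0: A, f1: B)

// The conversion a map step would perform: Java Tuple2 -> Scala tuple.
def toScalaTuple[A, B](jt: FlinkTuple2[A, B]): (A, B) = (jt.f0, jt.f1)

// In the streaming job this would be applied right after the source,
// e.g. stream.map(jt => toScalaTuple(jt)), so every downstream operator
// sees a plain Scala tuple.
val converted = toScalaTuple(FlinkTuple2("key", 42))
println(converted) // prints (key,42)
```

The reverse direction (Scala tuple back to Java Tuple2) is symmetric if a
downstream Java operator needs it.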
Hi guys,
In our Flink job we use a Java source to deserialize messages from Kafka
using a Kafka deserializer. The signature is as below.
public class CustomAvroDeserializationSchema implements
KafkaDeserializationSchema>
The other parts of the streaming job are in Scala. When data has to be