Hi people,
I found a workaround for that issue - which works at least for my use case.
The main idea was customizing
"org.apache.flink.formats.avro.registry.confluent.RegistryAvroFormatFactory"
such that the expected Avro schema is not derived from the CREATE TABLE SQL
statement but rather pa
Hi Peter,
don't get confused by the year 2017 in the ticket. We have had better Avro
support in the meantime, but that was based on the old type system around
TypeInformation. Now we need to build up this support again for the new
type system. I just came across this ticket and thought that the title fits.
Hi Peter,
as a temporary workaround I would simply implement a UDF like:
import org.apache.flink.table.annotation.DataTypeHint;
import org.apache.flink.table.functions.ScalarFunction;
import static org.apache.flink.table.annotation.InputGroup.ANY;

public class EverythingToString extends ScalarFunction {

    // Accepts any input type and falls back to its toString() representation
    public String eval(@DataTypeHint(inputGroup = ANY) Object o) {
        return o.toString();
    }
}
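Since the UDF above just delegates to toString(), its effect can be sketched in plain Java without any Flink dependencies. The names below are illustrative; CustomCharSequence stands in for a non-String CharSequence such as Avro's Utf8:

```java
// Plain-Java sketch of what the UDF returns for typical Avro-generated values.
public class EverythingToStringDemo {

    // Stand-in for a non-String CharSequence like org.apache.avro.util.Utf8
    static final class CustomCharSequence implements CharSequence {
        private final String s;
        CustomCharSequence(String s) { this.s = s; }
        public int length() { return s.length(); }
        public char charAt(int i) { return s.charAt(i); }
        public CharSequence subSequence(int a, int b) { return s.subSequence(a, b); }
        @Override public String toString() { return s; }
    }

    // Stand-in for an Avro-generated enum
    enum Color { RED, GREEN }

    // Mirrors the UDF's eval(...): everything becomes its toString() value
    static String eval(Object o) {
        return o == null ? null : o.toString();
    }

    public static void main(String[] args) {
        System.out.println(eval(new CustomCharSequence("hello"))); // hello
        System.out.println(eval(Color.GREEN));                     // GREEN
        System.out.println(eval(42));                              // 42
    }
}
```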
For the Utf8 issue, you can instruct Avro to generate Java classes
Hi Timo,
thanks a lot for your suggestion.
I also considered this workaround, but when going from the DataStream API to
the Table API (using the POJO generated by the Avro Maven plugin), types are
not mapped correctly, esp. Utf8 (Avro's implementation of CharSequence) and
also enums. In the table I then have mo
A current workaround is to use the DataStream API to read the data and
provide your custom Avro schema to configure the format. Then switch to
the Table API.
StreamTableEnvironment.fromDataStream(...) accepts all data types. Enum
classes will be represented as RAW types but you can forward them as
bl
Hi people!
I was digging deeper these days and found the "root cause" of the issue and the
difference between Avro reading from files and Avro reading from Kafka & SR.
Please see:
https://lists.apache.org/x/thread.html/r8ad7bd574f7dc4904139295c7de612a35438571c5b9caac673521d22@%3Cuser.flink.apache.o
Hi community,
Can I get advice on this question?
Another user just sent me an email asking whether I found a solution or a
workaround for this question, but I'm still stuck there.
Any suggestions?
Thanks in advance,
Dongwon
---------- Forwarded message ---------
From: Dongwon Kim
Date: Mon,