Hey,

thank you for your reply. Your converter looks very interesting. However, Flink 
already comes with the JsonRowSchemaConverter, which converts a JSONSchema 
string to a TypeInformation. From there you can convert the TypeInformation to, 
say, a DataType (although I must admit I only got this done using deprecated 
methods in Flink). What I am struggling with is the reverse direction - 
converting from a Flink ResolvedSchema (or LogicalType, or DataType) to a 
JSONSchema. Is that something you want to implement in your converter as well?

Your project is encouraging, though - maybe I will try to implement DataType 
to JSONSchema and ProtobufSchema to DataType (and the reverse) myself, given 
that I do not find anything that does the trick.
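For what it's worth, the core of such a converter would just be a recursive walk over the type tree that emits the matching JSON Schema keyword per type. A minimal, self-contained sketch of that mapping idea (plain type-name strings stand in for Flink's LogicalType here, and the class and method names are hypothetical - a real implementation would switch on LogicalTypeRoot and recurse into RowType fields):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: map a (simplified) Flink type tree to a JSON-Schema-like
// structure of nested Maps. Not the real Flink API - type names are plain
// strings so the idea stands alone without Flink on the classpath.
public class LogicalTypeToJsonSchema {

    // Map a primitive Flink SQL type name to its JSON Schema "type" keyword.
    static Map<String, Object> primitive(String flinkTypeName) {
        Map<String, Object> node = new LinkedHashMap<>();
        switch (flinkTypeName) {
            case "CHAR":
            case "VARCHAR":
                node.put("type", "string");
                break;
            case "BOOLEAN":
                node.put("type", "boolean");
                break;
            case "INT":
            case "BIGINT":
                node.put("type", "integer");
                break;
            case "FLOAT":
            case "DOUBLE":
                node.put("type", "number");
                break;
            default:
                throw new IllegalArgumentException("unmapped type: " + flinkTypeName);
        }
        return node;
    }

    // A ROW(...) becomes a JSON Schema "object" with one property per field.
    static Map<String, Object> row(Map<String, String> fields) {
        Map<String, Object> properties = new LinkedHashMap<>();
        fields.forEach((name, type) -> properties.put(name, primitive(type)));
        Map<String, Object> node = new LinkedHashMap<>();
        node.put("type", "object");
        node.put("properties", properties);
        return node;
    }
}
```

Serializing the resulting Map with Jackson would then give the JSONSchema string; the fiddly parts in practice are nullability, ARRAY/MAP/MULTISET types, and logical types like TIMESTAMP, which JSON Schema can only express via "format" annotations.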

-Theo

> On 9. Nov 2022, at 14:46, Andrew Otto <o...@wikimedia.org> wrote:
> 
> Hello! 
> 
> I see you are talking about JSONSchema, not just JSON itself.
> 
> We're trying to do a similar thing at Wikimedia and have developed some 
> tooling around this.  
> 
> JsonSchemaFlinkConverter 
> <https://gerrit.wikimedia.org/r/plugins/gitiles/wikimedia-event-utilities/+/refs/heads/master/eventutilities-flink/src/main/java/org/wikimedia/eventutilities/flink/formats/json/JsonSchemaFlinkConverter.java>
>  has some logic to convert from JSONSchema Jackson ObjectNodes to Flink Table 
> DataType or Table SchemaBuilder, or Flink DataStream TypeInformation[Row].  
> Some of the conversions from JSONSchema to Flink type are opinionated.  You 
> can see the mappings here 
> <https://gerrit.wikimedia.org/r/plugins/gitiles/wikimedia-event-utilities/+/refs/heads/master/eventutilities-flink/src/main/java/org/wikimedia/eventutilities/flink/formats/json/DataTypeSchemaConversions.java>.
> 
> On Wed, Nov 9, 2022 at 2:33 AM Theodor Wübker <theo.wueb...@inside-m2m.de 
> <mailto:theo.wueb...@inside-m2m.de>> wrote:
> Thanks for your reply Yaroslav! The way I do it with Avro seems similar to 
> what you pointed out:
> 
>     ResolvedSchema resultSchema = resultTable.getResolvedSchema();
>     DataType type = resultSchema.toSinkRowDataType();
>     org.apache.avro.Schema converted =
>         AvroSchemaConverter.convertToSchema(type.getLogicalType());
> 
> I mentioned the ResolvedSchema because it is my starting point after the SQL 
> operation. It seemed to me that I could not retrieve anything with more 
> schema information from the table, so I settled on this. About your other 
> answers: it seems the classes you mentioned can be used to serialize actual 
> data? However, this is not quite what I want to do.
> Essentially I want to convert the schema of a Flink table to both a Protobuf 
> schema and a JSON schema (for Avro, as you can see, I have it already). It 
> seems odd that this is not easily possible, because converting from a JSON 
> schema to a Flink schema is possible using the JsonRowSchemaConverter - the 
> other way is apparently not implemented. This is how I got a table Schema 
> (that I can use in a TableDescriptor) from a JSON schema:
> 
> TypeInformation<Row> type = JsonRowSchemaConverter.convert(json);
> DataType row = TableSchema.fromTypeInfo(type).toPhysicalRowDataType();
> Schema schema = Schema.newBuilder().fromRowDataType(row).build();
> Sidenote: I use deprecated methods here, so if there is a better approach 
> please let me know! But it shows that in Flink it is easily possible to 
> create a Schema for a TableDescriptor from a JSON Schema - the other way is 
> just not so trivial, it seems. And for Protobuf I have no solution so far, 
> not even for creating a Flink Schema from a Protobuf Schema - not to mention 
> the other way around.
> 
> -Theo
> 
> (resent because I accidentally only responded to you, not the Mailing list - 
> sorry)
> 
