rdblue commented on code in PR #11839:
URL: https://github.com/apache/iceberg/pull/11839#discussion_r1925832536
##########
flink/v1.20/flink/src/test/java/org/apache/iceberg/flink/data/TestFlinkParquetReader.java:
##########
@@ -236,4 +457,44 @@ protected void writeAndValidate(Schema schema) throws IOException {
RandomGenericData.generateFallbackRecords(schema, NUM_RECORDS, 21124,
NUM_RECORDS / 20),
schema);
}
+
+ @Override
+  protected void writeAndValidate(Schema writeSchema, Schema expectedSchema) throws IOException {
+ assumeThat(
+ TypeUtil.find(
+ writeSchema,
+            type -> type.isMapType() && type.asMapType().keyType() != Types.StringType.get()))
+ .as("Parquet Avro cannot write non-string map keys")
+ .isNull();
Review Comment:
Why is this using parquet-avro to read when the version of `writeAndValidate` just above does not have this restriction? There are assertions to validate `Record` with `RowData`, so I think this should use the same approach.

That would also allow you to share the implementation of both `writeAndValidate` versions, as was done in the other cases.
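The sharing the reviewer suggests is the usual overload-delegation pattern: the single-schema `writeAndValidate(Schema)` calls the two-schema `writeAndValidate(Schema, Schema)` with the same schema for both arguments, so the write/read/validate logic lives in one place. Below is a minimal, dependency-free sketch of that pattern; `String` stands in for Iceberg's `Schema`, and all names and the placeholder body are illustrative assumptions, not code from the PR.

```java
// Sketch of the shared-implementation pattern suggested in the review:
// the single-schema overload delegates to the two-schema one. String is
// a stand-in for Iceberg's Schema; the body is a placeholder.
public class WriteAndValidateSketch {

  // Single-schema entry point: write and read back with the same schema.
  static String writeAndValidate(String schema) {
    return writeAndValidate(schema, schema);
  }

  // Shared implementation: write with writeSchema, then validate the rows
  // read back against expectedSchema (here summarized as a string).
  static String writeAndValidate(String writeSchema, String expectedSchema) {
    return "wrote=" + writeSchema + ", validated=" + expectedSchema;
  }

  public static void main(String[] args) {
    System.out.println(writeAndValidate("tableSchema"));
    System.out.println(writeAndValidate("tableSchema", "projectedSchema"));
  }
}
```

With this shape, a projection-specific restriction (like the non-string map key assumption in the diff) would only be needed if the shared reader itself imposes it, which is the reviewer's point.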
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]