meetjunsu commented on a change in pull request #17542:
URL: https://github.com/apache/flink/pull/17542#discussion_r761629258



##########
File path: 
flink-formats/flink-parquet/src/main/java/org/apache/flink/formats/parquet/row/ParquetRowDataWriter.java
##########
@@ -224,6 +277,153 @@ private TimestampWriter(int precision) {
         public void write(RowData row, int ordinal) {
             recordConsumer.addBinary(timestampToInt96(row.getTimestamp(ordinal, precision)));
         }
+
+        @Override
+        public void write(ArrayData arrayData, int ordinal) {
+            recordConsumer.addBinary(timestampToInt96(arrayData.getTimestamp(ordinal, precision)));
+        }
+    }
+
+    /** Writes a map field to Parquet; both key and value are nullable. */
+    private class MapWriter implements FieldWriter {
+
+        private String repeatedGroupName;
+        private String keyName, valueName;
+        private FieldWriter keyWriter, valueWriter;
+
+        private MapWriter(LogicalType keyType, LogicalType valueType, GroupType groupType) {
+            // Get the internal map structure (MAP_KEY_VALUE)
+            GroupType repeatedType = groupType.getType(0).asGroupType();
+            this.repeatedGroupName = repeatedType.getName();
+
+            // Get key element information
+            Type type = repeatedType.getType(0);
+            this.keyName = type.getName();
+            this.keyWriter = createWriter(keyType, type);
+
+            // Get value element information
+            Type valuetype = repeatedType.getType(1);
+            this.valueName = valuetype.getName();
+            this.valueWriter = createWriter(valueType, valuetype);
+        }
+
+        @Override
+        public void write(RowData row, int ordinal) {
+            recordConsumer.startGroup();
+
+            MapData mapData = row.getMap(ordinal);
+
+            if (mapData != null && mapData.size() > 0) {
+                recordConsumer.startField(repeatedGroupName, 0);
+
+                ArrayData keyArray = mapData.keyArray();
+                ArrayData valueArray = mapData.valueArray();
+                for (int i = 0; i < keyArray.size(); i++) {
+                    recordConsumer.startGroup();
+                    // write key element
+                    recordConsumer.startField(keyName, 0);

Review comment:
       Updated as you suggested, but the current version does not affect the results.
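
For context, the diff above is truncated just after the key field is started. A minimal, self-contained sketch of the `RecordConsumer` event sequence that a map writer like this typically has to emit for one entry — using a hypothetical `EventRecorder` stand-in instead of Parquet's real `RecordConsumer`, and assuming the conventional Parquet MAP field names `key_value`, `key`, and `value` — might look like:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical recorder standing in for Parquet's RecordConsumer; it only
// logs calls so the required nesting of the event sequence is visible.
class EventRecorder {
    final List<String> events = new ArrayList<>();
    void startGroup() { events.add("startGroup"); }
    void endGroup() { events.add("endGroup"); }
    void startField(String name, int index) { events.add("startField " + name + ":" + index); }
    void endField(String name, int index) { events.add("endField " + name + ":" + index); }
    void addInteger(int v) { events.add("addInteger " + v); }
}

public class MapWriteSketch {
    // Sketch of writing a single map entry {7 -> 42}; mirrors the shape of
    // the loop body in the diff above, not Flink's actual implementation.
    static List<String> writeOneEntry(EventRecorder rc) {
        rc.startGroup();                 // outer map group
        rc.startField("key_value", 0);   // repeated key_value group
        rc.startGroup();                 // one entry
        rc.startField("key", 0);
        rc.addInteger(7);                // key element
        rc.endField("key", 0);
        rc.startField("value", 1);
        rc.addInteger(42);               // value element (skipped when null)
        rc.endField("value", 1);
        rc.endGroup();
        rc.endField("key_value", 0);
        rc.endGroup();
        return rc.events;
    }

    public static void main(String[] args) {
        writeOneEntry(new EventRecorder()).forEach(System.out::println);
    }
}
```

The essential invariant is that every `startGroup`/`startField` is closed in reverse order, and that a null value is expressed by omitting the `value` field entirely rather than writing a placeholder.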




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org