exceptionfactory commented on code in PR #10850:
URL: https://github.com/apache/nifi/pull/10850#discussion_r2771304057


##########
nifi-extension-bundles/nifi-extension-utils/nifi-record-utils/nifi-avro-record-utils/src/main/java/org/apache/nifi/avro/AvroTypeUtil.java:
##########
@@ -1171,10 +1179,22 @@ private static Object normalizeValue(final Object value, final Schema avroSchema
 
                 final String logicalName = logicalType.getName();
                 if (LOGICAL_TYPE_TIME_MICROS.equals(logicalName)) {
+                    // Handle both Long (legacy) and LocalTime (newer Avro libraries)
+                    if (value instanceof LocalTime localTime) {
+                        return new Time(localTime.toNanoOfDay() / 1_000_000);

Review Comment:
   Same as above, can `Time.valueOf(localTime)` be used?



##########
nifi-extension-bundles/nifi-extension-utils/nifi-record-utils/nifi-avro-record-utils/src/main/java/org/apache/nifi/avro/AvroTypeUtil.java:
##########
@@ -1155,9 +1155,17 @@ private static Object normalizeValue(final Object value, final Schema avroSchema
                 final String logicalName = logicalType.getName();
                 if (LOGICAL_TYPE_DATE.equals(logicalName)) {
                     // date logical name means that the value is number of days since Jan 1, 1970
+                    // Handle both Integer (legacy) and LocalDate (newer Avro libraries)
+                    if (value instanceof LocalDate localDate) {
+                        return java.sql.Date.valueOf(localDate);
+                    }
                     return java.sql.Date.valueOf(LocalDate.ofEpochDay((int) value));
                 } else if (LOGICAL_TYPE_TIME_MILLIS.equals(logicalName)) {
                     // time-millis logical name means that the value is number of milliseconds since midnight.
+                    // Handle both Integer (legacy) and LocalTime (newer Avro libraries)
+                    if (value instanceof LocalTime localTime) {
+                        return new Time(localTime.toNanoOfDay() / 1_000_000);

Review Comment:
   It looks like this can be simplified to `Time.valueOf(localTime)`
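   A minimal standalone sketch (not NiFi code) contrasting the conversion in the diff with the `Time.valueOf(LocalTime)` form suggested here and in the time-micros comment above; the class name and sample value are illustrative:

```java
import java.sql.Time;
import java.time.LocalTime;

// Standalone illustration only; not part of AvroTypeUtil.
public class LocalTimeToSqlTimeSketch {
    public static void main(final String[] args) {
        final LocalTime localTime = LocalTime.of(13, 45, 30, 123_000_000);

        // Form in the diff: Time(long) interprets its argument as epoch milliseconds,
        // so the rendered value depends on the JVM default time zone.
        final Time fromConstructor = new Time(localTime.toNanoOfDay() / 1_000_000);

        // Suggested simplification: Time.valueOf(LocalTime) keeps the hour, minute and
        // second of the LocalTime and discards the sub-second component.
        final Time fromValueOf = Time.valueOf(localTime);

        System.out.println(fromConstructor);
        System.out.println(fromValueOf);
    }
}
```

   One difference to weigh: `Time.valueOf` drops the fractional second, while the constructor form keeps millisecond precision but treats its argument as an epoch offset rather than a local time-of-day.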



##########
nifi-extension-bundles/nifi-parquet-bundle/nifi-parquet-processors/src/test/java/org/apache/nifi/parquet/TestParquetReader.java:
##########
@@ -217,6 +222,189 @@ public void testReaderWithAddListElementRecordsDisabled() throws IOException, In
                 "MapRecord[{name=Bob, favorite_number=1, favorite_colors=[blue, red, yellow]}]");
     }
 
+    /**
+     * Test for NIFI-15548: ParquetReader fails on timestamps.
+     * This test programmatically creates a parquet file with timestamp_millis logical type
+     * and verifies that the timestamp values can be read without ClassCastException.
+     */
+    @Test
+    public void testReadParquetWithTimestampMillis() throws IOException, MalformedRecordException {
+        final int numRecords = 5;
+        final File parquetFile = ParquetTestUtils.createTimestampParquetFile(numRecords, tempDir);
+
+        final List<Record> results = getRecords(parquetFile, emptyMap());
+
+        // The file should contain the expected number of records
+        assertNotNull(results);
+        assertEquals(numRecords, results.size());
+
+        // Verify each record can be read and contains valid timestamp values
+        for (int i = 0; i < numRecords; i++) {
+            final Record record = results.get(i);
+            assertNotNull(record);
+            assertEquals(i, record.getValue("id"));
+            assertEquals("Record" + i, record.getValue("name"));
+
+            // The timestamp should be readable without ClassCastException
+            final Object timestampValue = record.getValue("created_at");
+            assertNotNull(timestampValue, "Timestamp value should not be null for record " + i);
+        }
+    }
+
+    /**
+     * Test for NIFI-15548: ParquetReader fails on timestamps with nullable union types.
+     * This test programmatically creates a parquet file with nullable timestamp_millis logical type
+     * and verifies that the timestamp values can be read without ClassCastException.
+     */
+    @Test
+    public void testReadParquetWithNullableTimestampMillis() throws IOException, MalformedRecordException {
+        final int numRecords = 5;
+        final File parquetFile = ParquetTestUtils.createNullableTimestampParquetFile(numRecords, tempDir);
+
+        final List<Record> results = getRecords(parquetFile, emptyMap());
+
+        // The file should contain the expected number of records
+        assertNotNull(results);
+        assertEquals(numRecords, results.size());
+
+        // Verify each record can be read and contains valid timestamp values
+        for (int i = 0; i < numRecords; i++) {
+            final Record record = results.get(i);
+            assertNotNull(record);
+            assertEquals(i, record.getValue("id"));
+            assertEquals("Record" + i, record.getValue("name"));
+
+            // The timestamp should be readable without ClassCastException
+            final Object timestampValue = record.getValue("created_at");
+            assertNotNull(timestampValue, "Timestamp value should not be null for record " + i);
+        }
+    }
+
+    /**
+     * Test for date logical type.
+     * Verifies that parquet files with date logical type can be read without ClassCastException.
+     */
+    @Test
+    public void testReadParquetWithDateLogicalType() throws IOException, MalformedRecordException {
+        final int numRecords = 5;
+        final File parquetFile = ParquetTestUtils.createDateParquetFile(numRecords, tempDir);
+
+        final List<Record> results = getRecords(parquetFile, emptyMap());
+
+        assertNotNull(results);
+        assertEquals(numRecords, results.size());
+
+        for (int i = 0; i < numRecords; i++) {
+            final Record record = results.get(i);
+            assertNotNull(record);
+            assertEquals(i, record.getValue("id"));
+
+            final Object dateValue = record.getValue("date_field");
+            assertNotNull(dateValue, "Date value should not be null for record " + i);
+        }
+    }
+
+    /**
+     * Test for time-millis logical type.
+     * Verifies that parquet files with time-millis logical type can be read without ClassCastException.
+     */
+    @Test
+    public void testReadParquetWithTimeMillisLogicalType() throws IOException, MalformedRecordException {
+        final int numRecords = 5;
+        final File parquetFile = ParquetTestUtils.createTimeMillisParquetFile(numRecords, tempDir);
+
+        final List<Record> results = getRecords(parquetFile, emptyMap());
+
+        assertNotNull(results);
+        assertEquals(numRecords, results.size());
+
+        for (int i = 0; i < numRecords; i++) {
+            final Record record = results.get(i);
+            assertNotNull(record);
+            assertEquals(i, record.getValue("id"));
+
+            final Object timeValue = record.getValue("time_millis_field");
+            assertNotNull(timeValue, "Time-millis value should not be null for record " + i);
+        }
+    }
+
+    /**
+     * Test for time-micros logical type.
+     * Verifies that parquet files with time-micros logical type can be read without ClassCastException.
+     */
+    @Test
+    public void testReadParquetWithTimeMicrosLogicalType() throws IOException, MalformedRecordException {
+        final int numRecords = 5;
+        final File parquetFile = ParquetTestUtils.createTimeMicrosParquetFile(numRecords, tempDir);
+
+        final List<Record> results = getRecords(parquetFile, emptyMap());
+
+        assertNotNull(results);
+        assertEquals(numRecords, results.size());
+
+        for (int i = 0; i < numRecords; i++) {
+            final Record record = results.get(i);
+            assertNotNull(record);
+            assertEquals(i, record.getValue("id"));
+
+            final Object timeValue = record.getValue("time_micros_field");
+            assertNotNull(timeValue, "Time-micros value should not be null for record " + i);
+        }
+    }
+
+    /**
+     * Test for timestamp-micros logical type.
+     * Verifies that parquet files with timestamp-micros logical type can be read without ClassCastException.
+     */
+    @Test
+    public void testReadParquetWithTimestampMicrosLogicalType() throws IOException, MalformedRecordException {
+        final int numRecords = 5;
+        final File parquetFile = ParquetTestUtils.createTimestampMicrosParquetFile(numRecords, tempDir);
+
+        final List<Record> results = getRecords(parquetFile, emptyMap());
+
+        assertNotNull(results);
+        assertEquals(numRecords, results.size());
+
+        for (int i = 0; i < numRecords; i++) {
+            final Record record = results.get(i);
+            assertNotNull(record);
+            assertEquals(i, record.getValue("id"));
+
+            final Object timestampValue = record.getValue("timestamp_micros_field");
+            assertNotNull(timestampValue, "Timestamp-micros value should not be null for record " + i);
+        }
+    }
+
+    /**
+     * Comprehensive test for all temporal logical types.
+     * Verifies that parquet files with date, time-millis, time-micros, timestamp-millis,
+     * and timestamp-micros logical types can all be read without ClassCastException.
+     */
+    @Test
+    public void testReadParquetWithAllTemporalLogicalTypes() throws IOException, MalformedRecordException {

Review Comment:
   Instead of having test methods for individual types and the associated utility methods, it looks like this method can cover all the expected Timestamp and Date types.
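   A possible shape for that consolidated test, sketched from the helpers and field names in the hunks above; `createAllTemporalTypesParquetFile` is an assumed helper name (the combined `ParquetTestUtils` method is not shown here), and the sketch reuses `tempDir`, `getRecords`, `emptyMap()`, `java.util.List`, and the JUnit 5 assertions already used by `TestParquetReader`:

```java
@Test
public void testReadParquetWithAllTemporalLogicalTypes() throws IOException, MalformedRecordException {
    final int numRecords = 5;
    // Assumed helper that writes one file containing all five temporal columns
    final File parquetFile = ParquetTestUtils.createAllTemporalTypesParquetFile(numRecords, tempDir);

    final List<Record> results = getRecords(parquetFile, emptyMap());
    assertEquals(numRecords, results.size());

    // Field names reuse the ones from the per-type tests above; one loop over the
    // temporal fields replaces the five individual test methods.
    final List<String> temporalFields = List.of(
            "date_field", "time_millis_field", "time_micros_field", "created_at", "timestamp_micros_field");

    for (int i = 0; i < numRecords; i++) {
        final Record record = results.get(i);
        assertEquals(i, record.getValue("id"));
        for (final String field : temporalFields) {
            assertNotNull(record.getValue(field), field + " should not be null for record " + i);
        }
    }
}
```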


