ConeyLiu commented on code in PR #1173:
URL: https://github.com/apache/parquet-mr/pull/1173#discussion_r1381424815


##########
parquet-hadoop/src/main/java/org/apache/parquet/hadoop/rewrite/ParquetRewriter.java:
##########
@@ -324,15 +324,20 @@ private void processBlocksFromReader(IndexCache indexCache) throws IOException {
 
           // Translate compression and/or encryption
          writer.startColumn(descriptor, crStore.getColumnReader(descriptor).getTotalValueCount(), newCodecName);
-          processChunk(
+          boolean needOverwriteStatistics = processChunk(
                   chunk,
                   newCodecName,
                   columnChunkEncryptorRunTime,
                   encryptColumn,
                   indexCache.getBloomFilter(chunk),
                   indexCache.getColumnIndex(chunk),
                   indexCache.getOffsetIndex(chunk));
-          writer.endColumn();
+          if (needOverwriteStatistics) {
+            // All the column statistics are invalid, so we need to overwrite the column statistics
+            writer.endColumn(chunk.getStatistics());

Review Comment:
   I think that would be better. Then we would only need to call the API `public void invalidateStatistics(Statistics<?> totalStatistics)` inside `processChunk`.
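   
   For illustration, a minimal sketch of that shape (my own sketch, not code from the PR: it assumes `invalidateStatistics(Statistics<?>)` is added to `ParquetFileWriter` as proposed here, and it elides most of the real `processChunk` arguments):
   
   ```java
   import java.io.IOException;
   import org.apache.parquet.hadoop.ParquetFileWriter;
   import org.apache.parquet.hadoop.metadata.ColumnChunkMetaData;
   import org.apache.parquet.hadoop.metadata.CompressionCodecName;
   
   class RewriterSketch {
     private ParquetFileWriter writer;
   
     // Sketch of processChunk: instead of returning a boolean, push the
     // chunk-level statistics into the writer when the rewritten pages'
     // statistics can no longer be trusted.
     private void processChunk(ColumnChunkMetaData chunk, CompressionCodecName newCodecName) throws IOException {
       boolean statisticsInvalid = false;
       // ... translate pages, re-compress / re-encrypt ...
       if (statisticsInvalid) {
         // Proposed API from this discussion (not an existing ParquetFileWriter
         // method): overwrite the accumulated column statistics with the
         // original chunk-level statistics.
         writer.invalidateStatistics(chunk.getStatistics());
       }
     }
   
     // The caller stays as it was before this change:
     private void finishColumn(ColumnChunkMetaData chunk, CompressionCodecName newCodecName) throws IOException {
       processChunk(chunk, newCodecName);
       writer.endColumn(); // no statistics argument needed here any more
     }
   }
   ```
   
   That would keep the statistics handling inside `processChunk` and leave the `writer.endColumn()` call site untouched.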



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
