yihua opened a new issue, #14224:
URL: https://github.com/apache/hudi/issues/14224

   ### Bug Description
   
   **What happened:**
   We see many error logs in GH CI runs when reading log files to collect column range metadata: https://github.com/apache/hudi/actions/runs/19122669332/job/54649750254
   ```
   775596 [Executor task launch worker for task 0.0 in stage 60.0 (TID 225)] ERROR org.apache.hudi.common.table.log.BaseHoodieLogRecordReader [] - Got exception when reading log file
   org.apache.hudi.exception.CorruptedLogFileException: HoodieLogFile{pathStr='file:/tmp/junit-481472275906633469/dataset/p1/.055a4a3b-aacc-4bb2-a64a-a1214b0bb1e8_0000008.log.0_1-0-1', fileLen=-1} could not be read. Did not find the magic bytes at the start of the block
        at org.apache.hudi.common.table.log.HoodieLogFileReader.readMagic(HoodieLogFileReader.java:363) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.common.table.log.HoodieLogFileReader.hasNext(HoodieLogFileReader.java:347) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.common.table.log.HoodieLogFormatReader.hasNext(HoodieLogFormatReader.java:83) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.common.table.log.BaseHoodieLogRecordReader.scanInternalV1(BaseHoodieLogRecordReader.java:226) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.common.table.log.BaseHoodieLogRecordReader.scanInternal(BaseHoodieLogRecordReader.java:203) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.common.table.log.HoodieMergedLogRecordReader.performScan(HoodieMergedLogRecordReader.java:100) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.common.table.log.HoodieMergedLogRecordReader.<init>(HoodieMergedLogRecordReader.java:75) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.common.table.log.HoodieMergedLogRecordReader.<init>(HoodieMergedLogRecordReader.java:55) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.common.table.log.HoodieMergedLogRecordReader$Builder.build(HoodieMergedLogRecordReader.java:276) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.common.table.read.buffer.LogScanningRecordBufferLoader.scanLogFiles(LogScanningRecordBufferLoader.java:53) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.common.table.read.buffer.DefaultFileGroupRecordBufferLoader.getRecordBuffer(DefaultFileGroupRecordBufferLoader.java:81) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.common.table.read.HoodieFileGroupReader.initRecordIterators(HoodieFileGroupReader.java:134) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.common.table.read.HoodieFileGroupReader.getBufferedRecordIterator(HoodieFileGroupReader.java:291) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.common.table.read.HoodieFileGroupReader.getClosableHoodieRecordIterator(HoodieFileGroupReader.java:307) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.metadata.HoodieTableMetadataUtil.getLogFileColumnRangeMetadata(HoodieTableMetadataUtil.java:1784) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.metadata.HoodieTableMetadataUtil.readColumnRangeMetadataFrom(HoodieTableMetadataUtil.java:1741) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.metadata.HoodieTableMetadataUtil.getFileStatsRangeMetadata(HoodieTableMetadataUtil.java:2697) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.metadata.HoodieTableMetadataUtil.translateWriteStatToFileStats(HoodieTableMetadataUtil.java:2791) ~[hudi-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.metadata.HoodieMetadataWriteUtils.lambda$getFilesToFetchColumnStats$9(HoodieMetadataWriteUtils.java:565) ~[hudi-client-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:269) ~[?:1.8.0_472]
        at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384) ~[?:1.8.0_472]
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482) ~[?:1.8.0_472]
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472) ~[?:1.8.0_472]
        at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[?:1.8.0_472]
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:1.8.0_472]
        at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566) ~[?:1.8.0_472]
        at org.apache.hudi.metadata.HoodieMetadataWriteUtils.getFilesToFetchColumnStats(HoodieMetadataWriteUtils.java:565) ~[hudi-client-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.metadata.HoodieMetadataWriteUtils.lambda$convertMetadataToPartitionStatRecords$4f6e7b3a$1(HoodieMetadataWriteUtils.java:481) ~[hudi-client-common-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.hudi.data.HoodieJavaRDD.lambda$mapToPair$aa72055d$1(HoodieJavaRDD.java:178) ~[hudi-spark-client-1.2.0-SNAPSHOT.jar:1.2.0-SNAPSHOT]
        at org.apache.spark.api.java.JavaPairRDD$.$anonfun$pairFunToScalaFun$1(JavaPairRDD.scala:1073) ~[spark-core_2.12-3.4.3.jar:3.4.3]
        at scala.collection.Iterator$$anon$10.next(Iterator.scala:461) ~[scala-library-2.12.17.jar:?]
        at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486) ~[scala-library-2.12.17.jar:?]
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492) ~[scala-library-2.12.17.jar:?]
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:140) ~[spark-core_2.12-3.4.3.jar:3.4.3]
        at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59) ~[spark-core_2.12-3.4.3.jar:3.4.3]
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:101) ~[spark-core_2.12-3.4.3.jar:3.4.3]
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53) ~[spark-core_2.12-3.4.3.jar:3.4.3]
        at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:161) ~[spark-core_2.12-3.4.3.jar:3.4.3]
        at org.apache.spark.scheduler.Task.run(Task.scala:139) ~[spark-core_2.12-3.4.3.jar:3.4.3]
        at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:554) ~[spark-core_2.12-3.4.3.jar:3.4.3]
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1529) ~[spark-core_2.12-3.4.3.jar:3.4.3]
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:557) ~[spark-core_2.12-3.4.3.jar:3.4.3]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_472]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_472]
        at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_472]
   ```
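   
   For context on the failure, here is a minimal sketch (not Hudi code) of the magic-byte check that `HoodieLogFileReader#readMagic` performs at the start of each log block. It assumes the block magic is the 6-byte ASCII sequence `#HUDI#`; the file path is a placeholder:
   
   ```java
   import java.io.IOException;
   import java.io.InputStream;
   import java.nio.file.Files;
   import java.nio.file.Paths;
   import java.util.Arrays;
   
   public class LogMagicCheckSketch {
     // Assumed value of the Hudi log block magic ("#HUDI#" in ASCII).
     private static final byte[] MAGIC = new byte[] {'#', 'H', 'U', 'D', 'I', '#'};
   
     public static void main(String[] args) throws IOException {
       // Placeholder path; point this at a Hudi log file to inspect its header.
       try (InputStream in = Files.newInputStream(Paths.get("/tmp/sample.log"))) {
         byte[] head = new byte[MAGIC.length];
         int read = in.read(head);
         if (read != MAGIC.length || !Arrays.equals(head, MAGIC)) {
           // This is the condition on which HoodieLogFileReader#readMagic reports
           // "Did not find the magic bytes at the start of the block" and throws
           // CorruptedLogFileException.
           System.err.println("Did not find the magic bytes at the start of the block");
         } else {
           System.out.println("Block starts with the expected #HUDI# magic");
         }
       }
     }
   }
   ```
   
   An empty or partially written log file fails the same check, which may be what the column range metadata path is hitting here (note `fileLen=-1` in the trace).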
   
   **What you expected:**
   There should be no error.
   
   **Steps to reproduce:**
   Run GH CI on master, or run the relevant tests locally.
   
   
   ### Environment
   
   **Hudi version:** master
   **Query engine:** Spark
   **Relevant configs:** out-of-the-box
   
   
   ### Logs and Stack Trace
   
   _No response_

