[ 
https://issues.apache.org/jira/browse/HUDI-5758?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ethan Guo updated HUDI-5758:
----------------------------
    Sprint: Sprint 2023-01-31

> MOR table w/ delete block in 0.12.2 not readable in 0.13 and also not 
> compactable
> ---------------------------------------------------------------------------------
>
>                 Key: HUDI-5758
>                 URL: https://issues.apache.org/jira/browse/HUDI-5758
>             Project: Apache Hudi
>          Issue Type: Bug
>          Components: reader-core, writer-core
>    Affects Versions: 0.13.0
>            Reporter: sivabalan narayanan
>            Assignee: Alexey Kudinkin
>            Priority: Blocker
>              Labels: pull-request-available
>             Fix For: 0.13.0
>
>
> If a MOR table written with Hudi 0.12.2 contains a delete block in its log 
> files, reading it with 0.13.0 fails during Kryo deserialization. Compaction 
> fails for the same reason. 
>  
> Users who may be impacted: those using a MOR table with uncompacted file 
> groups that contain delete blocks.
> Delete blocks can be produced only in the following scenarios:
> a. A delete operation.
> b. A global index with "update partition path" set to true, which may 
> produce delete blocks when a record's partition path changes. 
>  
> Root cause:
> HoodieKey was made KryoSerializable as part of RFC-46, but it appears the 
> class was never registered with Kryo.
>  
> {code:java}
>  spark.sql("select * from hudi_trips_snapshot ").show(100, false)
> 23/02/09 16:53:43 WARN ObjectStore: Failed to get database global_temp, 
> returning NoSuchObjectException
> 19:02  WARN: [kryo] Unable to load class 7e51db6-6033-4794-ac59-44a930424b2b 
> with kryo's ClassLoader. Retrying with current..
> 23/02/09 16:53:44 ERROR AbstractHoodieLogRecordReader: Got exception when 
> reading log file
> com.esotericsoftware.kryo.KryoException: Unable to find class: 
> 7e51db6-6033-4794-ac59-44a930424b2b
> Serialization trace:
> orderingVal (org.apache.hudi.common.model.DeleteRecord)
>       at 
> com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:160)
>       at 
> com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133)
>       at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:693)
>       at 
> com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:118)
>       at 
> com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:543)
>       at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:731)
>       at 
> com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:391)
>       at 
> com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:302)
>       at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:813)
>       at 
> org.apache.hudi.common.util.SerializationUtils$KryoSerializerInstance.deserialize(SerializationUtils.java:100)
>       at 
> org.apache.hudi.common.util.SerializationUtils.deserialize(SerializationUtils.java:74)
>       at 
> org.apache.hudi.common.table.log.block.HoodieDeleteBlock.deserialize(HoodieDeleteBlock.java:106)
>       at 
> org.apache.hudi.common.table.log.block.HoodieDeleteBlock.getRecordsToDelete(HoodieDeleteBlock.java:91)
>       at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.processQueuedBlocksForInstant(AbstractHoodieLogRecordReader.java:675)
>       at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternalV1(AbstractHoodieLogRecordReader.java:367)
>       at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternal(AbstractHoodieLogRecordReader.java:223)
>       at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.performScan(HoodieMergedLogRecordScanner.java:198)
>       at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:114)
>       at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:73)
>       at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner$Builder.build(HoodieMergedLogRecordScanner.java:464)
>       at org.apache.hudi.LogFileIterator$.scanLog(Iterators.scala:326)
>       at org.apache.hudi.LogFileIterator.<init>(Iterators.scala:91)
>       at org.apache.hudi.RecordMergingFileIterator.<init>(Iterators.scala:172)
>       at 
> org.apache.hudi.HoodieMergeOnReadRDD.compute(HoodieMergeOnReadRDD.scala:100)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>       at 
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>       at 
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>       at 
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>       at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
>       at org.apache.spark.scheduler.Task.run(Task.scala:123)
>       at 
> org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
>       at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>       at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.ClassNotFoundException: 
> 7e51db6-6033-4794-ac59-44a930424b2b
>       at 
> org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:111)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>       at java.lang.Class.forName0(Native Method)
>       at java.lang.Class.forName(Class.java:348)
>       at 
> com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154)
>       ... 42 more
> Caused by: java.lang.ClassNotFoundException: 
> 7e51db6-6033-4794-ac59-44a930424b2b
>       at java.lang.ClassLoader.findClass(ClassLoader.java:530)
>       at 
> org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.java:35)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>       at 
> org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.java:40)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>       at 
> org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:106)
>       ... 47 more
> 23/02/09 16:53:44 ERROR Executor: Exception in task 0.0 in stage 40.0 (TID 78)
> org.apache.hudi.exception.HoodieException: Exception when reading log file 
>       at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternalV1(AbstractHoodieLogRecordReader.java:376)
>       at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternal(AbstractHoodieLogRecordReader.java:223)
>       at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.performScan(HoodieMergedLogRecordScanner.java:198)
>       at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:114)
>       at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:73)
>       at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner$Builder.build(HoodieMergedLogRecordScanner.java:464)
>       at org.apache.hudi.LogFileIterator$.scanLog(Iterators.scala:326)
>       at org.apache.hudi.LogFileIterator.<init>(Iterators.scala:91)
>       at org.apache.hudi.RecordMergingFileIterator.<init>(Iterators.scala:172)
>       at 
> org.apache.hudi.HoodieMergeOnReadRDD.compute(HoodieMergeOnReadRDD.scala:100)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>       at 
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>       at 
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>       at 
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>       at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
>       at org.apache.spark.scheduler.Task.run(Task.scala:123)
>       at 
> org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
>       at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>       at java.lang.Thread.run(Thread.java:748)
> Caused by: com.esotericsoftware.kryo.KryoException: Unable to find class: 
> 7e51db6-6033-4794-ac59-44a930424b2b
> Serialization trace:
> orderingVal (org.apache.hudi.common.model.DeleteRecord)
>       at 
> com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:160)
>       at 
> com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133)
>       at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:693)
>       at 
> com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:118)
>       at 
> com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:543)
>       at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:731)
>       at 
> com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:391)
>       at 
> com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:302)
>       at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:813)
>       at 
> org.apache.hudi.common.util.SerializationUtils$KryoSerializerInstance.deserialize(SerializationUtils.java:100)
>       at 
> org.apache.hudi.common.util.SerializationUtils.deserialize(SerializationUtils.java:74)
>       at 
> org.apache.hudi.common.table.log.block.HoodieDeleteBlock.deserialize(HoodieDeleteBlock.java:106)
>       at 
> org.apache.hudi.common.table.log.block.HoodieDeleteBlock.getRecordsToDelete(HoodieDeleteBlock.java:91)
>       at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.processQueuedBlocksForInstant(AbstractHoodieLogRecordReader.java:675)
>       at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternalV1(AbstractHoodieLogRecordReader.java:367)
>       ... 28 more
> Caused by: java.lang.ClassNotFoundException: 
> 7e51db6-6033-4794-ac59-44a930424b2b
>       at 
> org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:111)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>       at java.lang.Class.forName0(Native Method)
>       at java.lang.Class.forName(Class.java:348)
>       at 
> com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154)
>       ... 42 more
> Caused by: java.lang.ClassNotFoundException: 
> 7e51db6-6033-4794-ac59-44a930424b2b
>       at java.lang.ClassLoader.findClass(ClassLoader.java:530)
>       at 
> org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.java:35)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>       at 
> org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.java:40)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>       at 
> org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:106)
>       ... 47 more
> 23/02/09 16:53:44 WARN TaskSetManager: Lost task 0.0 in stage 40.0 (TID 78, 
> localhost, executor driver): org.apache.hudi.exception.HoodieException: 
> Exception when reading log file 
>       at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternalV1(AbstractHoodieLogRecordReader.java:376)
>       at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternal(AbstractHoodieLogRecordReader.java:223)
>       at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.performScan(HoodieMergedLogRecordScanner.java:198)
>       at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:114)
>       at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:73)
>       at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner$Builder.build(HoodieMergedLogRecordScanner.java:464)
>       at org.apache.hudi.LogFileIterator$.scanLog(Iterators.scala:326)
>       at org.apache.hudi.LogFileIterator.<init>(Iterators.scala:91)
>       at org.apache.hudi.RecordMergingFileIterator.<init>(Iterators.scala:172)
>       at 
> org.apache.hudi.HoodieMergeOnReadRDD.compute(HoodieMergeOnReadRDD.scala:100)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>       at 
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>       at 
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>       at 
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>       at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
>       at org.apache.spark.scheduler.Task.run(Task.scala:123)
>       at 
> org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
>       at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>       at java.lang.Thread.run(Thread.java:748)
> Caused by: com.esotericsoftware.kryo.KryoException: Unable to find class: 
> 7e51db6-6033-4794-ac59-44a930424b2b
> Serialization trace:
> orderingVal (org.apache.hudi.common.model.DeleteRecord)
>       at 
> com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:160)
>       at 
> com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133)
>       at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:693)
>       at 
> com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:118)
>       at 
> com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:543)
>       at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:731)
>       at 
> com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:391)
>       at 
> com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:302)
>       at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:813)
>       at 
> org.apache.hudi.common.util.SerializationUtils$KryoSerializerInstance.deserialize(SerializationUtils.java:100)
>       at 
> org.apache.hudi.common.util.SerializationUtils.deserialize(SerializationUtils.java:74)
>       at 
> org.apache.hudi.common.table.log.block.HoodieDeleteBlock.deserialize(HoodieDeleteBlock.java:106)
>       at 
> org.apache.hudi.common.table.log.block.HoodieDeleteBlock.getRecordsToDelete(HoodieDeleteBlock.java:91)
>       at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.processQueuedBlocksForInstant(AbstractHoodieLogRecordReader.java:675)
>       at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternalV1(AbstractHoodieLogRecordReader.java:367)
>       ... 28 more
> Caused by: java.lang.ClassNotFoundException: 
> 7e51db6-6033-4794-ac59-44a930424b2b
>       at 
> org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:111)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>       at java.lang.Class.forName0(Native Method)
>       at java.lang.Class.forName(Class.java:348)
>       at 
> com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154)
>       ... 42 more
> Caused by: java.lang.ClassNotFoundException: 
> 7e51db6-6033-4794-ac59-44a930424b2b
>       at java.lang.ClassLoader.findClass(ClassLoader.java:530)
>       at 
> org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.java:35)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>       at 
> org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.java:40)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>       at 
> org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:106)
>       ... 47 more
> 23/02/09 16:53:44 ERROR TaskSetManager: Task 0 in stage 40.0 failed 1 times; 
> aborting job
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in 
> stage 40.0 failed 1 times, most recent failure: Lost task 0.0 in stage 40.0 
> (TID 78, localhost, executor driver): 
> org.apache.hudi.exception.HoodieException: Exception when reading log file
>       at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternalV1(AbstractHoodieLogRecordReader.java:376)
>       at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternal(AbstractHoodieLogRecordReader.java:223)
>       at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.performScan(HoodieMergedLogRecordScanner.java:198)
>       at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:114)
>       at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:73)
>       at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner$Builder.build(HoodieMergedLogRecordScanner.java:464)
>       at org.apache.hudi.LogFileIterator$.scanLog(Iterators.scala:326)
>       at org.apache.hudi.LogFileIterator.<init>(Iterators.scala:91)
>       at org.apache.hudi.RecordMergingFileIterator.<init>(Iterators.scala:172)
>       at 
> org.apache.hudi.HoodieMergeOnReadRDD.compute(HoodieMergeOnReadRDD.scala:100)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>       at 
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>       at 
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>       at 
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>       at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
>       at org.apache.spark.scheduler.Task.run(Task.scala:123)
>       at 
> org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
>       at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
>       at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>       at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>       at java.lang.Thread.run(Thread.java:748)
> Caused by: com.esotericsoftware.kryo.KryoException: Unable to find class: 
> 7e51db6-6033-4794-ac59-44a930424b2b
> Serialization trace:
> orderingVal (org.apache.hudi.common.model.DeleteRecord)
>       at 
> com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:160)
>       at 
> com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133)
>       at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:693)
>       at 
> com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:118)
>       at 
> com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:543)
>       at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:731)
>       at 
> com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:391)
>       at 
> com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:302)
>       at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:813)
>       at 
> org.apache.hudi.common.util.SerializationUtils$KryoSerializerInstance.deserialize(SerializationUtils.java:100)
>       at 
> org.apache.hudi.common.util.SerializationUtils.deserialize(SerializationUtils.java:74)
>       at 
> org.apache.hudi.common.table.log.block.HoodieDeleteBlock.deserialize(HoodieDeleteBlock.java:106)
>       at 
> org.apache.hudi.common.table.log.block.HoodieDeleteBlock.getRecordsToDelete(HoodieDeleteBlock.java:91)
>       at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.processQueuedBlocksForInstant(AbstractHoodieLogRecordReader.java:675)
>       at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternalV1(AbstractHoodieLogRecordReader.java:367)
>       ... 28 more
> Caused by: java.lang.ClassNotFoundException: 
> 7e51db6-6033-4794-ac59-44a930424b2b
>       at 
> org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:111)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>       at java.lang.Class.forName0(Native Method)
>       at java.lang.Class.forName(Class.java:348)
>       at 
> com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154)
>       ... 42 more
> Caused by: java.lang.ClassNotFoundException: 
> 7e51db6-6033-4794-ac59-44a930424b2b
>       at java.lang.ClassLoader.findClass(ClassLoader.java:530)
>       at 
> org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.java:35)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>       at 
> org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.java:40)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>       at 
> org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:106)
>       ... 47 more
> Driver stacktrace:
>   at 
> org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1925)
>   at 
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1913)
>   at 
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1912)
>   at 
> scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
>   at 
> org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1912)
>   at 
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:948)
>   at 
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:948)
>   at scala.Option.foreach(Option.scala:257)
>   at 
> org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:948)
>   at 
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2146)
>   at 
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2095)
>   at 
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2084)
>   at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
>   at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:759)
>   at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
>   at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
>   at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
>   at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:365)
>   at 
> org.apache.spark.sql.execution.CollectLimitExec.executeCollect(limit.scala:38)
>   at 
> org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collectFromPlan(Dataset.scala:3389)
>   at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2550)
>   at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2550)
>   at org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3370)
>   at 
> org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:80)
>   at 
> org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:127)
>   at 
> org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:75)
>   at 
> org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$withAction(Dataset.scala:3369)
>   at org.apache.spark.sql.Dataset.head(Dataset.scala:2550)
>   at org.apache.spark.sql.Dataset.take(Dataset.scala:2764)
>   at org.apache.spark.sql.Dataset.getRows(Dataset.scala:254)
>   at org.apache.spark.sql.Dataset.showString(Dataset.scala:291)
>   at org.apache.spark.sql.Dataset.show(Dataset.scala:753)
>   ... 61 elided
> Caused by: org.apache.hudi.exception.HoodieException: Exception when reading 
> log file
>   at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternalV1(AbstractHoodieLogRecordReader.java:376)
>   at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternal(AbstractHoodieLogRecordReader.java:223)
>   at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.performScan(HoodieMergedLogRecordScanner.java:198)
>   at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:114)
>   at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:73)
>   at 
> org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner$Builder.build(HoodieMergedLogRecordScanner.java:464)
>   at org.apache.hudi.LogFileIterator$.scanLog(Iterators.scala:326)
>   at org.apache.hudi.LogFileIterator.<init>(Iterators.scala:91)
>   at org.apache.hudi.RecordMergingFileIterator.<init>(Iterators.scala:172)
>   at 
> org.apache.hudi.HoodieMergeOnReadRDD.compute(HoodieMergeOnReadRDD.scala:100)
>   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>   at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>   at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>   at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>   at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>   at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>   at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
>   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
>   at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
>   at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
>   at org.apache.spark.scheduler.Task.run(Task.scala:123)
>   at 
> org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
>   at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
>   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
>   at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>   at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>   at java.lang.Thread.run(Thread.java:748)
> Caused by: com.esotericsoftware.kryo.KryoException: Unable to find class: 
> 7e51db6-6033-4794-ac59-44a930424b2b
> Serialization trace:
> orderingVal (org.apache.hudi.common.model.DeleteRecord)
>   at 
> com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:160)
>   at 
> com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133)
>   at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:693)
>   at 
> com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:118)
>   at 
> com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:543)
>   at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:731)
>   at 
> com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:391)
>   at 
> com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.read(DefaultArraySerializers.java:302)
>   at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:813)
>   at 
> org.apache.hudi.common.util.SerializationUtils$KryoSerializerInstance.deserialize(SerializationUtils.java:100)
>   at 
> org.apache.hudi.common.util.SerializationUtils.deserialize(SerializationUtils.java:74)
>   at 
> org.apache.hudi.common.table.log.block.HoodieDeleteBlock.deserialize(HoodieDeleteBlock.java:106)
>   at 
> org.apache.hudi.common.table.log.block.HoodieDeleteBlock.getRecordsToDelete(HoodieDeleteBlock.java:91)
>   at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.processQueuedBlocksForInstant(AbstractHoodieLogRecordReader.java:675)
>   at 
> org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scanInternalV1(AbstractHoodieLogRecordReader.java:367)
>   ... 28 more
> Caused by: java.lang.ClassNotFoundException: 
> 7e51db6-6033-4794-ac59-44a930424b2b
>   at 
> org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:111)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>   at java.lang.Class.forName0(Native Method)
>   at java.lang.Class.forName(Class.java:348)
>   at 
> com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154)
>   ... 42 more
> Caused by: java.lang.ClassNotFoundException: 
> 7e51db6-6033-4794-ac59-44a930424b2b
>   at java.lang.ClassLoader.findClass(ClassLoader.java:530)
>   at 
> org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.java:35)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>   at 
> org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.java:40)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>   at 
> org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:106)
>   ... 47 more {code}
> Run book to reproduce: 
> [https://gist.github.com/nsivabalan/b45ebc6cb64ac1d1b45cf4e6ef6d6482]
>  
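The failure mode above can be illustrated with a simplified, hypothetical sketch (this is not Kryo's actual wire format, and the class and method names below are invented for illustration): when a class is registered, the writer encodes only a compact numeric id before the payload, but a reader without that registration expects a length-prefixed class *name* at the same stream position. The payload bytes (a record key) then get handed to Class.forName as if they were a class name, producing the same "Unable to find class: <uuid>" error seen in the log.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class RegistrationMismatch {

    // Writer side: the class is "registered", so only a small numeric id is
    // written ahead of the payload (here, a record key).
    static byte[] writeWithRegistration(String recordKey) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bos);
        out.writeByte(42);                       // registration id, not a name
        byte[] payload = recordKey.getBytes(StandardCharsets.UTF_8);
        out.writeInt(payload.length);
        out.write(payload);
        return bos.toByteArray();
    }

    // Reader side: the class is NOT registered, so the reader expects a
    // length-prefixed class *name* at this position and misinterprets the
    // payload bytes as that name.
    static String readWithoutRegistration(byte[] bytes) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(bytes));
        in.readByte();                           // consumed as a format flag
        int nameLength = in.readInt();
        byte[] name = new byte[nameLength];
        in.readFully(name);
        return new String(name, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        byte[] bytes = writeWithRegistration("7e51db6-6033-4794-ac59-44a930424b2b");
        String misreadName = readWithoutRegistration(bytes);
        try {
            Class.forName(misreadName);          // same failure mode as the log
        } catch (ClassNotFoundException e) {
            System.out.println("Unable to find class: " + e.getMessage());
        }
    }
}
```

This is why writer-side and reader-side registration must match: either both versions register the class (and agree on ids), or neither does.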



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
