[GitHub] [hudi] xccui commented on issue #8325: [SUPPORT] spark read hudi error: Unable to instantiate HFileBootstrapIndex

2023-04-11 Thread via GitHub


xccui commented on issue #8325:
URL: https://github.com/apache/hudi/issues/8325#issuecomment-1504472901

   Got some time today to take a closer look at the errors. `HFileBootstrapIndex` needs to access some remote data during initialization, so a connection issue (e.g., the file system being closed or the connection being interrupted for some reason) most likely caused the initialization to fail. It shouldn't be a compatibility problem.
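   To see why a connection error would surface as "Unable to instantiate class": the index is constructed reflectively, so any exception thrown inside the constructor is wrapped in an `InvocationTargetException`, and the loader reports only the instantiation failure. A minimal, self-contained sketch of that wrapping behavior (the `RemoteIndex` class and its message are hypothetical, not Hudi code):
   
   ```java
   import java.lang.reflect.Constructor;
   import java.lang.reflect.InvocationTargetException;
   
   // Stand-in for a class whose constructor does remote I/O and hits a
   // transient failure (simulated here with a plain IOException).
   class RemoteIndex {
       RemoteIndex() throws java.io.IOException {
           throw new java.io.IOException("Filesystem closed");
       }
   }
   
   public class ReflectionDemo {
       static String instantiate(Class<?> cls) {
           try {
               Constructor<?> c = cls.getDeclaredConstructor();
               c.newInstance();
               return "ok";
           } catch (InvocationTargetException e) {
               // The real root cause is only reachable via getCause();
               // the top-level message just says instantiation failed.
               return "Unable to instantiate class " + cls.getName()
                       + " (cause: " + e.getCause().getMessage() + ")";
           } catch (ReflectiveOperationException e) {
               return "reflection error";
           }
       }
   
       public static void main(String[] args) {
           // prints: Unable to instantiate class RemoteIndex (cause: Filesystem closed)
           System.out.println(instantiate(RemoteIndex.class));
       }
   }
   ```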
   
   Maybe we could move some logic out of the constructor.
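   One way to realize that suggestion is plain lazy initialization: keep the constructor side-effect-free (store configuration only) and defer the remote read to first use, so reflective instantiation can no longer fail on a transient connection error. A hypothetical sketch under that assumption — class and method names are illustrative, not Hudi's actual API:
   
   ```java
   // Illustrative only: the constructor does no I/O; the remote read happens
   // lazily on the first getIndex() call and is cached afterwards.
   public class LazyBootstrapIndex {
       private final String basePath;
       private String index; // loaded lazily, guarded by synchronized getIndex()
   
       public LazyBootstrapIndex(String basePath) {
           this.basePath = basePath; // no remote access here
       }
   
       public synchronized String getIndex() {
           if (index == null) {
               index = loadFromStorage(); // remote call deferred to first use
           }
           return index;
       }
   
       private String loadFromStorage() {
           // Stand-in for the actual HFile read against remote storage.
           return "index@" + basePath;
       }
   
       public static void main(String[] args) {
           LazyBootstrapIndex idx = new LazyBootstrapIndex("s3://bucket/table");
           System.out.println(idx.getIndex()); // prints: index@s3://bucket/table
       }
   }
   ```
   
   Failures would then occur inside an ordinary method call, where the root cause propagates directly instead of being hidden behind the reflective wrapper.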


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@hudi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [hudi] xccui commented on issue #8325: [SUPPORT] spark read hudi error: Unable to instantiate HFileBootstrapIndex

2023-04-04 Thread via GitHub


xccui commented on issue #8325:
URL: https://github.com/apache/hudi/issues/8325#issuecomment-1496179790

   > > I hit the same exception in a Flink writer job. It happened when the job was trying to recover from a failure.
   > > Hudi version: 0.13.0, Flink version: 1.16.1
   > 
   > Have you tried to reproduce the error? Does it happen when you restart the job manually? If it does reproduce, can you share your configs and table field types so we can try to reproduce locally and do some debugging?
   
   It happened when a job tried to recover from a failure. Here is the full 
stack trace.
   
   ```
   2023-04-03 18:25:00 [stream_write: response_insertion_ids (5/5)#4] ERROR org.apache.hudi.io.HoodieAppendHandle [] - Error writing record HoodieRecord{key=HoodieKey { recordKey=platformId:120,requestId:cfuleqeijb4qrau0tl4g partitionPath=dt=2023-02-28/hr=01}, currentLocation='HoodieRecordLocation {instantTime=U, fileId=null}', newLocation='null'}
   org.apache.hudi.exception.HoodieException: Unable to instantiate class org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex
   	at org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:79) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.common.bootstrap.index.BootstrapIndex.getBootstrapIndex(BootstrapIndex.java:163) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.common.table.view.AbstractTableFileSystemView.init(AbstractTableFileSystemView.java:114) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.common.table.view.HoodieTableFileSystemView.init(HoodieTableFileSystemView.java:113) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.common.table.view.HoodieTableFileSystemView.<init>(HoodieTableFileSystemView.java:107) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.common.table.view.FileSystemViewManager.createInMemoryFileSystemView(FileSystemViewManager.java:177) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.common.table.view.FileSystemViewManager.lambda$createViewManager$5fcdabfe$1(FileSystemViewManager.java:272) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.common.table.view.FileSystemViewManager.lambda$getFileSystemView$1(FileSystemViewManager.java:115) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at java.util.concurrent.ConcurrentHashMap.computeIfAbsent(Unknown Source) ~[?:?]
   	at org.apache.hudi.common.table.view.FileSystemViewManager.getFileSystemView(FileSystemViewManager.java:114) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.table.HoodieTable.getSliceView(HoodieTable.java:316) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.io.HoodieAppendHandle.init(HoodieAppendHandle.java:162) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.io.HoodieAppendHandle.doWrite(HoodieAppendHandle.java:486) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.io.HoodieWriteHandle.write(HoodieWriteHandle.java:175) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.execution.ExplicitWriteHandler.consume(ExplicitWriteHandler.java:49) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.execution.ExplicitWriteHandler.consume(ExplicitWriteHandler.java:35) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.common.util.queue.SimpleExecutor.execute(SimpleExecutor.java:67) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.execution.FlinkLazyInsertIterable.computeNext(FlinkLazyInsertIterable.java:64) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.execution.FlinkLazyInsertIterable.computeNext(FlinkLazyInsertIterable.java:43) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at org.apache.hudi.client.utils.LazyIterableIterator.next(LazyIterableIterator.java:119) ~[blob_p-17a9236a425d801a5f243ade3abda3e034663e62-8dfab769092e9e11eec10f6efa74d93b:?]
   	at java.util.Iterator.forEachRemaining(Unknown Source) ~[?:?]
   ```
[GitHub] [hudi] xccui commented on issue #8325: [SUPPORT] spark read hudi error: Unable to instantiate HFileBootstrapIndex

2023-04-03 Thread via GitHub


xccui commented on issue #8325:
URL: https://github.com/apache/hudi/issues/8325#issuecomment-1494840392

   I hit the same exception in a Flink writer job. It happened when the job was 
trying to recover from a failure.
   
   Hudi version: 0.13.0
   Flink version: 1.16.1


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@hudi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org