[ https://issues.apache.org/jira/browse/HUDI-7839?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17853545#comment-17853545 ]

Vova Kolmakov commented on HUDI-7839:
-------------------------------------

Fixed via master branch: 9f9064761bac766cc7884027432568c06817ddd7

> Can not find props file when using HoodieDeltaStreamer with Hudi 0.14.1
> -----------------------------------------------------------------------
>
>                 Key: HUDI-7839
>                 URL: https://issues.apache.org/jira/browse/HUDI-7839
>             Project: Apache Hudi
>          Issue Type: Bug
>            Reporter: Xiaoxuan Li
>            Assignee: Vova Kolmakov
>            Priority: Major
>
> When using HoodieDeltaStreamer with Hudi 0.14.1, the following error is thrown:
> {noformat}
> Cannot read properties from dfs from file file:/mnt1/yarn/usercache/hadoop/appcache/application_1717399456895_0009/container_1717399456895_0009_02_000001/src/test/resources/streamer-config/dfs-source.properties{noformat}
>  
> It works fine on Hudi 0.14.0, so it is likely related to a change introduced in
> 0.14.1 -> [https://github.com/apache/hudi/pull/9913]
>  
> error log:
> {code:java}
> 24/06/06 22:42:09 INFO Client:
>      client token: N/A
>      diagnostics: User class threw exception: org.apache.hudi.exception.HoodieIOException: Cannot read properties from dfs from file file:/mnt1/yarn/usercache/hadoop/appcache/application_1717399456895_0009/container_1717399456895_0009_02_000001/src/test/resources/streamer-config/dfs-source.properties
>     at org.apache.hudi.common.config.DFSPropertiesConfiguration.addPropsFromFile(DFSPropertiesConfiguration.java:166)
>     at org.apache.hudi.common.config.DFSPropertiesConfiguration.<init>(DFSPropertiesConfiguration.java:85)
>     at org.apache.hudi.utilities.UtilHelpers.readConfig(UtilHelpers.java:232)
>     at org.apache.hudi.utilities.streamer.HoodieStreamer$Config.getProps(HoodieStreamer.java:437)
>     at org.apache.hudi.utilities.streamer.StreamSync.getDeducedSchemaProvider(StreamSync.java:656)
>     at org.apache.hudi.utilities.streamer.StreamSync.fetchNextBatchFromSource(StreamSync.java:632)
>     at org.apache.hudi.utilities.streamer.StreamSync.fetchFromSourceAndPrepareRecords(StreamSync.java:525)
>     at org.apache.hudi.utilities.streamer.StreamSync.readFromSource(StreamSync.java:498)
>     at org.apache.hudi.utilities.streamer.StreamSync.syncOnce(StreamSync.java:404)
>     at org.apache.hudi.utilities.streamer.HoodieStreamer$StreamSyncService.ingestOnce(HoodieStreamer.java:850)
>     at org.apache.hudi.utilities.ingestion.HoodieIngestionService.startIngestion(HoodieIngestionService.java:72)
>     at org.apache.hudi.common.util.Option.ifPresent(Option.java:97)
>     at org.apache.hudi.utilities.streamer.HoodieStreamer.sync(HoodieStreamer.java:207)
>     at org.apache.hudi.utilities.streamer.HoodieStreamer.main(HoodieStreamer.java:592)
>     at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
>     at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.base/java.lang.reflect.Method.invoke(Method.java:568)
>     at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:741)
> Caused by: java.io.FileNotFoundException: File file:/mnt1/yarn/usercache/hadoop/appcache/application_1717399456895_0009/container_1717399456895_0009_02_000001/src/test/resources/streamer-config/dfs-source.properties does not exist
>     at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:968)
>     at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:1289)
>     at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:958)
>     at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:472)
>     at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:188)
>     at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:581)
>     at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:1004)
>     at org.apache.hudi.common.config.DFSPropertiesConfiguration.addPropsFromFile(DFSPropertiesConfiguration.java:161)
>     ... 18 more
>      ApplicationMaster host: ip-172-31-75-55.ec2.internal
>      ApplicationMaster RPC port: 43905
>      queue: default
>      start time: 1717713711465
>      final status: FAILED
>      tracking URL: http://ip-172-31-69-122.ec2.internal:20888/proxy/application_1717399456895_0009/
>      user: hadoop
> 24/06/06 22:42:09 ERROR Client: Application diagnostics message: User class threw exception: org.apache.hudi.exception.HoodieIOException: Cannot read properties from dfs from file file:/mnt1/yarn/usercache/hadoop/appcache/application_1717399456895_0009/container_1717399456895_0009_02_000001/src/test/resources/streamer-config/dfs-source.properties
> [same stack trace as above]
> Exception in thread "main" org.apache.spark.SparkException: Application application_1717399456895_0009 finished with failed status
>     at org.apache.spark.deploy.yarn.Client.run(Client.scala:1321)
>     at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1754)
>     at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1075)
>     at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:194)
>     at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:217)
>     at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
>     at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1167)
>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1176)
>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 24/06/06 22:42:09 INFO ShutdownHookManager: Shutdown hook called{code}
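The `file:/mnt1/yarn/...` prefix in the error indicates the relative `--props` path was resolved against the YARN container's working directory, where the file does not exist. A minimal sketch of that resolution behavior, using plain JDK path handling rather than Hudi's actual `DFSPropertiesConfiguration` code (the paths are illustrative assumptions from the report):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class PropsPathResolution {
    public static void main(String[] args) {
        // Relative --props value, as it appears in the report above.
        String relProps = "src/test/resources/streamer-config/dfs-source.properties";

        // A scheme-less relative path ends up resolved against the JVM's
        // current working directory. In YARN cluster mode that is the
        // container's application cache directory (the file:/mnt1/yarn/...
        // prefix in the error), where the file is absent.
        Path resolvedOnDriver = Paths.get(relProps).toAbsolutePath();
        System.out.println("Resolved against cwd: " + resolvedOnDriver);

        // A fully qualified URI (hypothetical location shown here) is not
        // subject to cwd-dependent resolution, which is one common way to
        // sidestep this class of failure.
        String qualified = "hdfs:///configs/streamer-config/dfs-source.properties";
        System.out.println("Qualified URI used as-is: " + qualified);
    }
}
```

The sketch only illustrates why the same relative path can work locally but fail in cluster mode; the actual fix landed in the commit referenced in the comment above.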



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
