wannaberich commented on issue #877: java.lang.NoSuchMethodError: 
io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
URL: https://github.com/apache/incubator-hudi/issues/877#issuecomment-528433568
 
 
   I decided to try integrating the Apache Hudi master branch (0.5.0-SNAPSHOT). As I wrote earlier, the bundle jars throw exceptions when placed in $SPARK_HOME/jars, which is why I packaged every module independently and put all the module jars into $SPARK_HOME/jars.
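
   For reference, the usual alternative to copying individual module jars is building the single Spark bundle jar. A sketch, under the assumption that the master branch keeps the standard Maven layout (the bundle module path below is assumed and may differ by branch):

   ```shell
   # Hypothetical build sketch: produce the Hudi Spark bundle jar from source
   # instead of copying each module jar. Module paths are assumptions and may
   # differ on the 0.5.0-SNAPSHOT branch -- check the packaging/ directory.
   git clone https://github.com/apache/incubator-hudi.git
   cd incubator-hudi
   mvn clean package -DskipTests
   # copy whichever spark-bundle jar the build produced onto Spark's classpath
   cp packaging/*spark-bundle*/target/*spark-bundle*.jar "$SPARK_HOME/jars/"
   ```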
   
   This does appear to work to some extent (at least the job starts, and I don't see any import-related exceptions). I'm trying to integrate Apache Hudi into HDP 3.1 (HDFS 3.1.*, Spark 2.3.*, Hive 3.*).
   
   When I run a job from Zeppelin, I get the following exception:
   
   ```
   org.apache.hudi.exception.HoodieInsertException: Failed to insert for commit time 20190905124111
     at org.apache.hudi.HoodieWriteClient.insert(HoodieWriteClient.java:228)
     at org.apache.hudi.DataSourceUtils.doWriteOperation(DataSourceUtils.java:178)
     at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:143)
     at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
     at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
     at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
     at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
     at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
     at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
     at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
     at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
     at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
     at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
     at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
     at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
     at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
     at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:656)
     at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:656)
     at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
     at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:656)
     at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:273)
     at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:267)
     at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:225)
     ... 41 elided
   Caused by: java.lang.NoSuchMethodError: org.apache.hadoop.fs.FSDataOutputStream: method <init>(Ljava/io/OutputStream;)V not found
     at org.apache.hudi.common.io.storage.SizeAwareFSDataOutputStream.<init>(SizeAwareFSDataOutputStream.java:46)
     at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.wrapOutputStream(HoodieWrapperFileSystem.java:160)
     at org.apache.hudi.common.io.storage.HoodieWrapperFileSystem.create(HoodieWrapperFileSystem.java:168)
     at org.apache.hudi.common.table.timeline.HoodieActiveTimeline.createFileInPath(HoodieActiveTimeline.java:386)
     at org.apache.hudi.common.table.timeline.HoodieActiveTimeline.createFileInMetaPath(HoodieActiveTimeline.java:371)
     at org.apache.hudi.common.table.timeline.HoodieActiveTimeline.saveToInflight(HoodieActiveTimeline.java:359)
     at org.apache.hudi.HoodieWriteClient.saveWorkloadProfileMetadataToInflight(HoodieWriteClient.java:417)
     at org.apache.hudi.HoodieWriteClient.upsertRecordsInternal(HoodieWriteClient.java:440)
     at org.apache.hudi.HoodieWriteClient.insert(HoodieWriteClient.java:223)
     ... 63 more
   ```
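
   The root cause in the trace is a `NoSuchMethodError` for the single-argument `FSDataOutputStream(OutputStream)` constructor: the Hadoop classes on the cluster no longer provide a constructor that the Hudi jars were compiled against. A small, hypothetical diagnostic (the class name and helper are mine, not part of Hudi) can list the constructors the deployed classpath actually exposes:

   ```java
   import java.lang.reflect.Constructor;
   import java.util.ArrayList;
   import java.util.List;

   // Hypothetical diagnostic, not part of Hudi: lists the public constructor
   // signatures visible for a class on the current classpath. Running it on
   // the Spark driver with "org.apache.hadoop.fs.FSDataOutputStream" shows
   // whether the single-argument constructor from the error message exists.
   public class ConstructorCheck {
       public static List<String> constructorSignatures(String className)
               throws ClassNotFoundException {
           List<String> signatures = new ArrayList<>();
           for (Constructor<?> c : Class.forName(className).getConstructors()) {
               signatures.add(c.toString());
           }
           return signatures;
       }

       public static void main(String[] args) throws Exception {
           // Default to the class from the stack trace; any fully qualified
           // class name can be passed as the first argument instead.
           String name = args.length > 0 ? args[0]
                   : "org.apache.hadoop.fs.FSDataOutputStream";
           constructorSignatures(name).forEach(System.out::println);
       }
   }
   ```

   If the printed list has no constructor taking only a `java.io.OutputStream`, the Hadoop API Hudi was compiled against and the Hadoop jars on the cluster disagree, which matches the error above.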
   
   From the exception, I understand that the problem comes from the HDFS version (3.1.*). Does that mean it's impossible to run Apache Hudi with HDFS above 2.7.*?
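
   Not necessarily: a `NoSuchMethodError` usually indicates a mismatch between the Hadoop version Hudi was compiled against and the one on the cluster, rather than a hard incompatibility, so one commonly suggested workaround is rebuilding Hudi against the cluster's Hadoop artifacts. A sketch, assuming the build exposes a `hadoop.version` Maven property (verify the actual property name in the root `pom.xml` first):

   ```shell
   # Hypothetical rebuild sketch: compile Hudi against the HDP 3.1 Hadoop
   # artifacts so that calls like the FSDataOutputStream constructor resolve
   # against the same API the cluster runs. The hadoop.version property name
   # is an assumption -- check the root pom.xml before relying on it.
   mvn clean package -DskipTests -Dhadoop.version=3.1.1
   ```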
