hudi-bot opened a new issue, #14644: URL: https://github.com/apache/hudi/issues/14644
I am using Hudi 0.5.2 with the ScalaTest 2.2.5 framework on Scala 2.11, and I see the following error when saving from a test, although the same code runs on the Spark cluster with no errors. Is there a compatibility issue between ScalaTest and this Hudi version? If so, which version should I use? I tried ScalaTest 3.0.0 as well, but the issue is the same.

```
[scalatest] Cause: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hudi.avro.model.HoodieCleanerPlan
[scalatest]   at org.apache.hudi.table.HoodieCopyOnWriteTable.scheduleClean(HoodieCopyOnWriteTable.java:295)
[scalatest]   at org.apache.hudi.client.HoodieCleanClient.scheduleClean(HoodieCleanClient.java:114)
[scalatest]   at org.apache.hudi.client.HoodieCleanClient.clean(HoodieCleanClient.java:91)
[scalatest]   at org.apache.hudi.client.HoodieWriteClient.clean(HoodieWriteClient.java:835)
[scalatest]   at org.apache.hudi.client.HoodieWriteClient.postCommit(HoodieWriteClient.java:512)
[scalatest]   at org.apache.hudi.client.AbstractHoodieWriteClient.commit(AbstractHoodieWriteClient.java:157)
[scalatest]   at org.apache.hudi.client.AbstractHoodieWriteClient.commit(AbstractHoodieWriteClient.java:101)
[scalatest]   at org.apache.hudi.client.AbstractHoodieWriteClient.commit(AbstractHoodieWriteClient.java:92)
[scalatest]   at org.apache.hudi.HoodieSparkSqlWriter$.checkWriteStatus(HoodieSparkSqlWriter.scala:262)
[scalatest]   at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:184)
[scalatest]   at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
[scalatest]   at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
[scalatest]   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
[scalatest]   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
[scalatest]   at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
[scalatest]   at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
[scalatest]   at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
[scalatest]   at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
[scalatest]   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
[scalatest]   at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
[scalatest]   at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
[scalatest]   at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
[scalatest]   at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
[scalatest]   at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
[scalatest]   at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
[scalatest]   at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
[scalatest]   at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
[scalatest]   at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
[scalatest]   at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:668)
[scalatest]   at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:276)
[scalatest]   at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:270)
[scalatest]   at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:228)
```
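The failing call is the plain `DataFrameWriter.save` path shown at the bottom of the trace. A minimal sketch of the kind of spec that hits this is below; the table name, record key, partition field, and output path are placeholders, not my actual code:

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}
import org.scalatest.FunSuite

// Minimal sketch of the failing write path. The table name, record key,
// partition field, and output path are placeholders, not the real values.
class HudiWriteSpec extends FunSuite {

  test("save a small DataFrame as a Hudi table") {
    val spark = SparkSession.builder()
      .master("local[2]")
      .appName("hudi-write-spec")
      // Hudi requires Kryo serialization on the Spark side.
      .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .getOrCreate()

    import spark.implicits._
    val df = Seq((1, "a", 1000L), (2, "b", 2000L)).toDF("id", "name", "ts")

    df.write
      .format("org.apache.hudi")
      .option("hoodie.table.name", "test_table")
      .option("hoodie.datasource.write.recordkey.field", "id")
      .option("hoodie.datasource.write.partitionpath.field", "name")
      .option("hoodie.datasource.write.precombine.field", "ts")
      .mode(SaveMode.Overwrite)
      // The NoClassDefFoundError is thrown after the commit succeeds, when
      // the post-commit clean tries to build a HoodieCleanerPlan (see the
      // top frames of the trace above).
      .save("/tmp/hudi/test_table")

    spark.stop()
  }
}
```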
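For reference, a build definition matching the versions above would look roughly like this in sbt; the Spark version is an assumption, since it isn't stated here:

```scala
// build.sbt sketch; the Spark version is assumed, not given above
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"  % "2.4.4" % "provided", // assumed Spark line
  "org.apache.hudi"  %% "hudi-spark" % "0.5.2-incubating",
  "org.scalatest"    %% "scalatest"  % "2.2.5" % "test"
)
```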
## JIRA info

- Link: https://issues.apache.org/jira/browse/HUDI-1171
- Type: Bug
