Nishith Agarwal created HUDI-2026:
-------------------------------------

             Summary: Add documentation for GlobalDeleteKeyGenerator
                 Key: HUDI-2026
                 URL: https://issues.apache.org/jira/browse/HUDI-2026
             Project: Apache Hudi
          Issue Type: Sub-task
            Reporter: Nishith Agarwal
            Assignee: sivabalan narayanan


[https://github.com/apache/hudi/issues/3008]

Hard deletes written with GlobalDeleteKeyGenerator fail during Hive sync: HoodieHiveClient instantiates the configured key generator via Class.newInstance(), which requires a public no-argument constructor that GlobalDeleteKeyGenerator does not provide.

{code:java}
 - should hard delete records from hudi table with hive sync *** FAILED *** (24 seconds, 49 milliseconds)
Cause: java.lang.NoSuchMethodException: org.apache.hudi.keygen.GlobalDeleteKeyGenerator.<init>()
[scalatest]   at java.lang.Class.getConstructor0(Class.java:3110)
[scalatest]   at java.lang.Class.newInstance(Class.java:412)
[scalatest]   at org.apache.hudi.hive.HoodieHiveClient.<init>(HoodieHiveClient.java:98)
[scalatest]   at org.apache.hudi.hive.HiveSyncTool.<init>(HiveSyncTool.java:69)
[scalatest]   at org.apache.hudi.HoodieSparkSqlWriter$.org$apache$hudi$HoodieSparkSqlWriter$$syncHive(HoodieSparkSqlWriter.scala:391)
[scalatest]   at org.apache.hudi.HoodieSparkSqlWriter$$anonfun$metaSync$2.apply(HoodieSparkSqlWriter.scala:440)
[scalatest]   at org.apache.hudi.HoodieSparkSqlWriter$$anonfun$metaSync$2.apply(HoodieSparkSqlWriter.scala:436)
[scalatest]   at scala.collection.mutable.HashSet.foreach(HashSet.scala:78)
[scalatest]   at org.apache.hudi.HoodieSparkSqlWriter$.metaSync(HoodieSparkSqlWriter.scala:436)
[scalatest]   at org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:497)
[scalatest]   at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:222)
[scalatest]   at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:145)
[scalatest]   at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
[scalatest]   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
[scalatest]   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
[scalatest]   at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
[scalatest]   at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
[scalatest]   at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
[scalatest]   at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
[scalatest]   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
[scalatest]   at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
[scalatest]   at org.apach
{code}
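For context, the failing test drives a hard delete through the Spark datasource with Hive sync enabled, roughly as in the sketch below (an illustration, not the test itself). The option keys are the standard Hudi datasource write options; the application name, table name, base path, DataFrame contents, and record key field are hypothetical placeholders.

{code:scala}
// Minimal sketch, assuming the hudi-spark bundle is on the classpath and a
// Hive metastore is reachable for sync. All names/paths below are hypothetical.
import org.apache.spark.sql.{SaveMode, SparkSession}

object GlobalDeleteHiveSyncSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("global-delete-hive-sync-sketch")
      .master("local[2]")
      .getOrCreate()
    import spark.implicits._

    // Keys of the records to hard delete (hypothetical record key field "uuid").
    val deletesDf = Seq("key-1", "key-2").toDF("uuid")

    deletesDf.write
      .format("hudi")
      .option("hoodie.table.name", "hudi_trips")                  // hypothetical table name
      .option("hoodie.datasource.write.operation", "delete")      // hard delete
      .option("hoodie.datasource.write.recordkey.field", "uuid")
      .option("hoodie.datasource.write.keygenerator.class",
        "org.apache.hudi.keygen.GlobalDeleteKeyGenerator")
      .option("hoodie.datasource.hive_sync.enable", "true")       // kicks off HiveSyncTool / HoodieHiveClient
      .option("hoodie.datasource.hive_sync.table", "hudi_trips")  // hypothetical Hive table
      .mode(SaveMode.Append)
      .save("/tmp/hudi_trips")                                    // hypothetical base path

    spark.stop()
  }
}
{code}

Per the trace, the save goes through HoodieSparkSqlWriter.write and commitAndPerformPostOperations into metaSync, which constructs HiveSyncTool and HoodieHiveClient; that is the point where the reflective instantiation of GlobalDeleteKeyGenerator fails.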



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
