cocoapan opened a new issue, #6410:
URL: https://github.com/apache/hudi/issues/6410

   Hi,
   When I update a Hudi table using the MERGE INTO syntax, I get the following error:
   `assertion failed: hoodie.payload.update.condition.assignments have not set`
   <img width="1756" alt="image" src="https://user-images.githubusercontent.com/99819932/184908862-07a8185a-9337-4c3a-a3c8-ad06040d1c21.png">
   
   Spark SQL:
   ```sql
   merge into A as t
   using (select * from B) as s
   on (t.cola = s.colb and t.colb = s.colb and t.colc between ... and ...)
   when not matched then insert
   ```
   
   This statement has no WHEN MATCHED clause. After I create the target table, the merge runs successfully the first time, but every subsequent run fails with the error above. If I add a WHEN MATCHED clause, the job succeeds; however, I don't actually need WHEN MATCHED.
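   
   A minimal sketch of the reproduction, assuming hypothetical table names and schemas (placeholders; my real tables differ):
   
   ```sql
   -- Hypothetical target table; names/types are placeholders, not my real schema
   create table A (cola bigint, colb bigint, colc date, val string)
   using hudi
   tblproperties (primaryKey = 'cola', preCombineField = 'colb');
   
   -- Hypothetical source table (plain parquet, just to feed the merge)
   create table B (cola bigint, colb bigint, colc date, val string)
   using parquet;
   
   -- The first run after creating A succeeds (all rows take the insert path);
   -- once A already contains matching keys, the run fails with the assertion above.
   merge into A as t
   using (select * from B) as s
   on t.cola = s.cola
   when not matched then insert *;
   ```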
   
   Hudi Spark guide:
   https://hudi.apache.org/docs/quick-start-guide
   <img width="983" alt="image" src="https://user-images.githubusercontent.com/99819932/184910269-40c2379c-3f26-4f8d-a933-7ad169c0c2b8.png">
   
   According to this guide, WHEN MATCHED is optional, but in practice the opposite appears to be true.
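   
   For context, the MERGE INTO grammar in the guide reads roughly as follows (paraphrased from the docs; square brackets mark the clauses the guide presents as optional):
   
   ```
   MERGE INTO tableIdentifier AS target_alias
   USING (sub_query | tableIdentifier) AS source_alias
   ON <merge_condition>
   [ WHEN MATCHED [ AND <condition> ] THEN <matched_action> ]
   [ WHEN NOT MATCHED [ AND <condition> ] THEN <not_matched_action> ]
   ```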
   
   I hope the Hudi community can help me confirm this issue.
   
   Thank you very much.
   
   
   **Environment Description**
   
   * Hudi version : 0.11
   
   * Spark version : 3.1.2
   
   * Hive version : -
   
   * Hadoop version : -
   
   * Storage (HDFS/S3/GCS..) : HDFS
   
   * Running on Docker? (yes/no) : no
   
   
   
   
   **Stacktrace**
   
   ```
   Caused by: org.apache.hudi.exception.HoodieException: 
org.apache.hudi.exception.HoodieException: 
java.util.concurrent.ExecutionException: java.lang.AssertionError: assertion 
failed: hoodie.payload.update.condition.assignments have not set
   
   at 
org.apache.hudi.table.action.commit.HoodieMergeHelper.runMerge(HoodieMergeHelper.java:149)
   at 
org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpdateInternal(BaseSparkCommitActionExecutor.java:358)
   at 
org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpdate(BaseSparkCommitActionExecutor.java:349)
   at 
org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpsertPartition(BaseSparkCommitActionExecutor.java:322)
   ... 29 more
   Caused by: org.apache.hudi.exception.HoodieException: 
java.util.concurrent.ExecutionException: java.lang.AssertionError: assertion 
failed: hoodie.payload.update.condition.assignments have not set
   
   at 
org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:160)
   at 
org.apache.hudi.table.action.commit.HoodieMergeHelper.runMerge(HoodieMergeHelper.java:147)
   ... 32 more
   Caused by: java.util.concurrent.ExecutionException: 
java.lang.AssertionError: assertion failed: 
hoodie.payload.update.condition.assignments have not set
   
   at java.util.concurrent.FutureTask.report(FutureTask.java:122)
   at java.util.concurrent.FutureTask.get(FutureTask.java:192)
   at 
org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.execute(BoundedInMemoryExecutor.java:154)
   ... 33 more
   Caused by: java.lang.AssertionError: assertion failed: 
hoodie.payload.update.condition.assignments have not set
   
   at scala.Predef$.assert(Predef.scala:223)
   at 
org.apache.spark.sql.hudi.command.payload.ExpressionPayload.processMatchedRecord(ExpressionPayload.scala:96)
   at 
org.apache.spark.sql.hudi.command.payload.ExpressionPayload.combineAndGetUpdateValue(ExpressionPayload.scala:77)
   at org.apache.hudi.io.HoodieMergeHandle.write(HoodieMergeHandle.java:328)
   at 
org.apache.hudi.table.action.commit.BaseMergeHelper$UpdateHandler.consumeOneRecord(BaseMergeHelper.java:122)
   at 
org.apache.hudi.table.action.commit.BaseMergeHelper$UpdateHandler.consumeOneRecord(BaseMergeHelper.java:112)
   at 
org.apache.hudi.common.util.queue.BoundedInMemoryQueueConsumer.consume(BoundedInMemoryQueueConsumer.java:37)
   at 
org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.lambda$null$273(BoundedInMemoryExecutor.java:134)
   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
   ... 3 more
   
   Driver stacktrace:
   at 
org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2273)
   at 
org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2222)
   at 
org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2221)
   at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
   at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
   at 
org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2221)
   at 
org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1079)
   at 
org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1079)
   at scala.Option.foreach(Option.scala:407)
   at 
org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1079)
   at 
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2462)
   at 
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2404)
   at 
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2393)
   at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
   at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:868)
   at org.apache.spark.SparkContext.runJob(SparkContext.scala:2293)
   at org.apache.spark.SparkContext.runJob(SparkContext.scala:2314)
   at org.apache.spark.SparkContext.runJob(SparkContext.scala:2333)
   at org.apache.spark.SparkContext.runJob(SparkContext.scala:2358)
   at org.apache.spark.rdd.RDD.count(RDD.scala:1253)
   at 
org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:646)
   at 
org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:314)
   at 
org.apache.spark.sql.hudi.command.MergeIntoHoodieTableCommand.executeInsertOnly(MergeIntoHoodieTableCommand.scala:311)
   at 
org.apache.spark.sql.hudi.command.MergeIntoHoodieTableCommand.run(MergeIntoHoodieTableCommand.scala:156)
   at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
   at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
   at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
   at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:228)
   at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3689)
   at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:105)
   at 
org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:172)
   at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:92)
   at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:801)
   at 
org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:66)
   at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3687)
   at org.apache.spark.sql.Dataset.<init>(Dataset.scala:228)
   at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
   at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:801)
   at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
   at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:623)
   at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:801)
   at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:616)
   at org.apache.livy.thriftserver.session.SqlJob.executeSql(SqlJob.java:87)
   at org.apache.livy.thriftserver.session.SqlJob.call(SqlJob.java:67)
   at org.apache.livy.thriftserver.session.SqlJob.call(SqlJob.java:35)
   at org.apache.livy.rsc.driver.JobWrapper.call(JobWrapper.java:82)
   at org.apache.livy.rsc.driver.JobWrapper.call(JobWrapper.java:34)
   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
   at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.hudi.exception.HoodieUpsertException: Error upserting 
bucketType UPDATE for partition :0
   
   at 
org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpsertPartition(BaseSparkCommitActionExecutor.java:329)
   at 
org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleInsertPartition(BaseSparkCommitActionExecutor.java:335)
   at 
org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.lambda$mapPartitionsAsRDD$a3ab3c4$1(BaseSparkCommitActionExecutor.java:246)
   at 
org.apache.spark.api.java.JavaRDDLike.$anonfun$mapPartitionsWithIndex$1(JavaRDDLike.scala:102)
   at 
org.apache.spark.api.java.JavaRDDLike.$anonfun$mapPartitionsWithIndex$1$adapted(JavaRDDLike.scala:102)
   at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2(RDD.scala:915)
   at 
org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2$adapted(RDD.scala:915)
   at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
   at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
   at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
   at org.apache.spark.rdd.RDD.$anonfun$getOrCompute$1(RDD.scala:386)
   at 
org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1440)
   at 
org.apache.spark.storage.BlockManager.org$apache$spark$storage$BlockManager$$doPut(BlockManager.scala:1350)
   at 
org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1414)
   at 
org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:1237)
   at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:384)
   at org.apache.spark.rdd.RDD.iterator(RDD.scala:335)
   at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
   at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373)
   at org.apache.spark.rdd.RDD.iterator(RDD.scala:337)
   at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
   at org.apache.spark.scheduler.Task.run(Task.scala:131)
   at 
org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497)
   at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1453)
   at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500)
   ... 3 more
   Caused by: org.apache.hudi.exception.HoodieException: 
org.apache.hudi.exception.HoodieException: 
java.util.concurrent.ExecutionException: java.lang.AssertionError: assertion 
failed: hoodie.payload.update.condition.assignments have not set
   ```
   
   

