hudi-bot opened a new issue, #15137:
URL: https://github.com/apache/hudi/issues/15137

   I updated the Hudi partition columns from 'year,month' to 'year'. Then I 
ran the process in overwrite mode. The process executed successfully and the 
Hudi table was created. 
   
   However, when the process was triggered in 'append' mode, I started getting 
the error below:
   
   ```
   Task 0 in stage 32.0 failed 4 times; aborting job
   java.lang.Exception: Job aborted due to stage failure: Task 0 in stage 32.0 failed 4 times, most recent failure: Lost task 0.3 in stage 32.0 (TID 1207, ip-10-73-110-184.ec2.internal, executor 6): org.apache.hudi.exception.HoodieUpsertException: Error upserting bucketType UPDATE for partition :0
       at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpsertPartition(BaseSparkCommitActionExecutor.java:305)
   ```
   
   Then I reverted the partition columns back to 'year,month', but I still got 
the same error. However, when I write the data to a different folder in 
'append' mode, the script runs fine and I can see the Hudi table. 
   
   In short, the process fails when I try to append data to the same path. 
Can you please look into this? This is critical for us because the jobs are 
stuck.
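   For context, here is a minimal sketch of the configuration change being described. All table names, keys, and paths below are placeholders, not values from this issue. One plausible trigger (an assumption, not confirmed by this report) is that changing `hoodie.datasource.write.partitionpath.field` between writes without recreating the table leaves the existing `hoodie.properties` and on-disk layout reflecting the old partitioning, so a subsequent append/upsert resolves records against partition paths that no longer match.
   
   ```python
   # Hypothetical sketch of the Hudi write options involved in this report.
   # Record key, precombine field, and table name are placeholders.

   # Options used when the table was first written (partitioned by year and month):
   hudi_options_before = {
       "hoodie.table.name": "example_table",                      # placeholder
       "hoodie.datasource.write.recordkey.field": "id",           # placeholder
       "hoodie.datasource.write.precombine.field": "ts",          # placeholder
       "hoodie.datasource.write.partitionpath.field": "year,month",
   }

   # Options after the change described above (partitioned by year only):
   hudi_options_after = dict(
       hudi_options_before,
       **{"hoodie.datasource.write.partitionpath.field": "year"},
   )

   # A Spark writer would apply these roughly like (not executed here):
   #   df.write.format("hudi").options(**hudi_options_after) \
   #     .mode("append").save(base_path)
   #
   # Appending into an existing table whose metadata still records the old
   # partition columns is the mismatch this sketch illustrates.
   ```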
   
   ## JIRA info
   
   - Link: https://issues.apache.org/jira/browse/HUDI-3915
   - Type: Bug
   
   
   ---
   
   
   ## Comments
   
   **14/Sep/22 21:33, alexey.kudinkin:** [~ngupta2206] can you please provide 
the full stack trace? 
   
   Also, which Spark and Hudi versions are you using?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
