hudi-bot commented on PR #10426:
URL: https://github.com/apache/hudi/pull/10426#issuecomment-1870909915
## CI report:
* d51cd5d57bc3150b23f083be4f3b765fa893c877 Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
hudi-bot commented on PR #10297:
URL: https://github.com/apache/hudi/pull/10297#issuecomment-1870909616
## CI report:
* 8e87b6af48f9d7597c6955532d5ee7e61125813b Azure:
[SUCCESS](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
xuzifu666 closed pull request #10297: [HUDI-7208] Do writing stage should
shutdown with error when insert failed to reduce user execute time and show
error details
URL: https://github.com/apache/hudi/pull/10297
--
This is an automated message from the Apache Git Service.
To respond to the me
hudi-bot commented on PR #10426:
URL: https://github.com/apache/hudi/pull/10426#issuecomment-1870905228
## CI report:
* d51cd5d57bc3150b23f083be4f3b765fa893c877 Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
hudi-bot commented on PR #10345:
URL: https://github.com/apache/hudi/pull/10345#issuecomment-1870904981
## CI report:
* 03f4555ddfa2cc2861c4624888724d51caca28aa Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
hudi-bot commented on PR #10297:
URL: https://github.com/apache/hudi/pull/10297#issuecomment-1870904874
## CI report:
* 8e87b6af48f9d7597c6955532d5ee7e61125813b Azure:
[SUCCESS](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
xuzifu666 commented on code in PR #10297:
URL: https://github.com/apache/hudi/pull/10297#discussion_r1437448466
##
hudi-client/hudi-spark-client/src/main/java/org/apache/hudi/table/action/commit/BaseSparkCommitActionExecutor.java:
##
@@ -292,6 +294,9 @@ protected void
setCommit
hudi-bot commented on PR #10426:
URL: https://github.com/apache/hudi/pull/10426#issuecomment-1870900238
## CI report:
* d51cd5d57bc3150b23f083be4f3b765fa893c877 Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
xuzifu666 commented on code in PR #10297:
URL: https://github.com/apache/hudi/pull/10297#discussion_r1437446264
##
hudi-client/hudi-client-common/src/main/java/org/apache/hudi/config/HoodieWriteConfig.java:
##
@@ -765,6 +765,14 @@ public class HoodieWriteConfig extends HoodieCon
hudi-bot commented on PR #10297:
URL: https://github.com/apache/hudi/pull/10297#issuecomment-1870899907
## CI report:
* 8e87b6af48f9d7597c6955532d5ee7e61125813b Azure:
[SUCCESS](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
[
https://issues.apache.org/jira/browse/HUDI-7249?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Danny Chen updated HUDI-7249:
-
Fix Version/s: 1.0.0
> Close compaction when using append mode
> ---
>
[
https://issues.apache.org/jira/browse/HUDI-7249?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Danny Chen closed HUDI-7249.
Resolution: Fixed
Fixed via master branch: 3b3ca961d9e960306c49624b90713a6ec91f9530
> Close compaction when
This is an automated email from the ASF dual-hosted git repository.
danny0405 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/hudi.git
The following commit(s) were added to refs/heads/master by this push:
new 3b3ca961d9e [HUDI-7249] Disable mor compaction
danny0405 merged PR #10388:
URL: https://github.com/apache/hudi/pull/10388
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: commits-unsubscr...@hudi.apac
danny0405 closed issue #10423: [SUPPORT] spark read hudi table can not ignore
append log file content
URL: https://github.com/apache/hudi/issues/10423
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to
danny0405 commented on issue #10423:
URL: https://github.com/apache/hudi/issues/10423#issuecomment-1870890189
It is the code that breaks compatibility; generally you can't use an old
release for reading because the old release has no knowledge of the new features.
--
This is an automated message fr
danny0405 commented on PR #10427:
URL: https://github.com/apache/hudi/pull/10427#issuecomment-1870889209
I see related changes: https://github.com/apache/hudi/pull/10389
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use
danny0405 commented on code in PR #10413:
URL: https://github.com/apache/hudi/pull/10413#discussion_r1437436842
##
hudi-spark-datasource/hudi-spark/src/main/scala/org/apache/spark/sql/hudi/command/procedures/RepairOverwriteHoodiePropsProcedure.scala:
##
@@ -57,7 +67,7 @@ class R
danny0405 commented on issue #10424:
URL: https://github.com/apache/hudi/issues/10424#issuecomment-1870885655
We have not designed the upgrade procedure for 1.0 yet, so there is no way to do the
upgrade.
--
This is an automated message from the Apache Git Service.
To respond to the message, please l
danny0405 commented on code in PR #10297:
URL: https://github.com/apache/hudi/pull/10297#discussion_r1437434113
##
hudi-client/hudi-spark-client/src/main/java/org/apache/hudi/table/action/commit/BaseSparkCommitActionExecutor.java:
##
@@ -292,6 +294,9 @@ protected void
setCommit
danny0405 commented on code in PR #10297:
URL: https://github.com/apache/hudi/pull/10297#discussion_r1437434021
##
hudi-client/hudi-client-common/src/main/java/org/apache/hudi/config/HoodieWriteConfig.java:
##
@@ -765,6 +765,14 @@ public class HoodieWriteConfig extends HoodieCon
danny0405 commented on code in PR #10297:
URL: https://github.com/apache/hudi/pull/10297#discussion_r1437433114
##
hudi-client/hudi-client-common/src/main/java/org/apache/hudi/config/HoodieWriteConfig.java:
##
@@ -765,6 +765,14 @@ public class HoodieWriteConfig extends HoodieCon
empcl commented on code in PR #10413:
URL: https://github.com/apache/hudi/pull/10413#discussion_r1437431464
##
hudi-spark-datasource/hudi-spark/src/main/scala/org/apache/spark/sql/hudi/command/procedures/RepairOverwriteHoodiePropsProcedure.scala:
##
@@ -57,7 +67,7 @@ class Repai
empcl commented on code in PR #10413:
URL: https://github.com/apache/hudi/pull/10413#discussion_r1437415765
##
hudi-spark-datasource/hudi-spark/src/main/scala/org/apache/spark/sql/hudi/command/procedures/RepairOverwriteHoodiePropsProcedure.scala:
##
@@ -57,7 +67,7 @@ class Repai
hudi-bot commented on PR #10389:
URL: https://github.com/apache/hudi/pull/10389#issuecomment-1870876284
## CI report:
* 248df7c04d611c5f521f309732aa21351161fa8b UNKNOWN
* daa335eb9234834ded995c1cc27480b94d2bb0e3 Azure:
[PENDING](https://dev.azure.com/apache-hudi-ci-org/785b6ef4
hudi-bot commented on PR #10426:
URL: https://github.com/apache/hudi/pull/10426#issuecomment-1870872508
## CI report:
* d51cd5d57bc3150b23f083be4f3b765fa893c877 Azure:
[PENDING](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
hudi-bot commented on PR #10389:
URL: https://github.com/apache/hudi/pull/10389#issuecomment-1870872428
## CI report:
* 248df7c04d611c5f521f309732aa21351161fa8b UNKNOWN
* daa335eb9234834ded995c1cc27480b94d2bb0e3 UNKNOWN
Bot commands
@hudi-bot supports the followi
hudi-bot commented on PR #10342:
URL: https://github.com/apache/hudi/pull/10342#issuecomment-1870872297
## CI report:
* cb62ad9bed32bf3acc6f8227e5e824cb73e8f0e4 UNKNOWN
* a774c2a1efccf6012b20f0a94b44f8b3ae4cdbbe Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4
hudi-bot commented on PR #10426:
URL: https://github.com/apache/hudi/pull/10426#issuecomment-1870868702
## CI report:
* d51cd5d57bc3150b23f083be4f3b765fa893c877 Azure:
[PENDING](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
hudi-bot commented on PR #10344:
URL: https://github.com/apache/hudi/pull/10344#issuecomment-1870868486
## CI report:
* d5c669fdb2b061ff6e65b42aa969be2902c033c7 Azure:
[SUCCESS](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
hudi-bot commented on PR #10342:
URL: https://github.com/apache/hudi/pull/10342#issuecomment-1870868450
## CI report:
* cb62ad9bed32bf3acc6f8227e5e824cb73e8f0e4 UNKNOWN
* a774c2a1efccf6012b20f0a94b44f8b3ae4cdbbe Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4
xuzifu666 commented on PR #10297:
URL: https://github.com/apache/hudi/pull/10297#issuecomment-1870867883
> @xuzifu666 Thanks for the contribution. This PR adds throwing an exception in each
write handler. Is it possible to check all write statuses for failures during
the commit phase?
Good adv
beyond1920 commented on PR #10297:
URL: https://github.com/apache/hudi/pull/10297#issuecomment-1870865138
@xuzifu666 Thanks for the contribution.
Is it possible to check all write statuses for failures during the commit
phase?
--
This is an automated message from the Apache Git Service.
To
xuzifu666 commented on code in PR #10297:
URL: https://github.com/apache/hudi/pull/10297#discussion_r1437417679
##
hudi-client/hudi-client-common/src/main/java/org/apache/hudi/io/HoodieAppendHandle.java:
##
@@ -294,6 +295,9 @@ private Option prepareRecord(HoodieRecord
hoodieRec
xuzifu666 commented on code in PR #10297:
URL: https://github.com/apache/hudi/pull/10297#discussion_r1437324192
##
hudi-client/hudi-client-common/src/main/java/org/apache/hudi/io/HoodieAppendHandle.java:
##
@@ -294,6 +295,9 @@ private Option prepareRecord(HoodieRecord
hoodieRec
beyond1920 commented on issue #10407:
URL: https://github.com/apache/hudi/issues/10407#issuecomment-1870862338
![image](https://github.com/apache/hudi/assets/1525333/9083eb26-71fd-4656-9c25-c0374fc7ccf2)
@zyclove Data deduplication caused by records with the same primary key value
ar
hehuiyuan commented on code in PR #10388:
URL: https://github.com/apache/hudi/pull/10388#discussion_r1437409285
##
hudi-flink-datasource/hudi-flink/src/main/java/org/apache/hudi/table/HoodieTableSink.java:
##
@@ -96,6 +96,8 @@ public SinkRuntimeProvider getSinkRuntimeProvider(Co
wanghuan2054 commented on issue #10424:
URL: https://github.com/apache/hudi/issues/10424#issuecomment-1870853563
> Are you using 0.14.0 or 1.0? We have this check in 1.0. Since 1.0, the
commit metadata file naming changed to:
`${start_time}_${completion_time}.${action}`
I use release
wanghuan2054 commented on issue #10404:
URL: https://github.com/apache/hudi/issues/10404#issuecomment-1870852423
> Did you use the 1.0 release? For 0.14.0, the completion time is fetched
through the metadata file modification time.
I use the release-1.0.0-beta1 branch
--
This is an automate
hudi-bot commented on PR #10427:
URL: https://github.com/apache/hudi/pull/10427#issuecomment-1870847324
## CI report:
* 2f65fd8901d85bc013370abfea7921a6ce88386d Azure:
[PENDING](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
hudi-bot commented on PR #10345:
URL: https://github.com/apache/hudi/pull/10345#issuecomment-1870847172
## CI report:
* b6ebd945a7d980ee5044edcd5b4f8390f6bd8adf Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
hudi-bot commented on PR #10427:
URL: https://github.com/apache/hudi/pull/10427#issuecomment-1870843941
## CI report:
* 2f65fd8901d85bc013370abfea7921a6ce88386d UNKNOWN
Bot commands
@hudi-bot supports the following commands:
- `@hudi-bot run azure` re-run th
hudi-bot commented on PR #10426:
URL: https://github.com/apache/hudi/pull/10426#issuecomment-1870843925
## CI report:
* d51cd5d57bc3150b23f083be4f3b765fa893c877 Azure:
[PENDING](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
hudi-bot commented on PR #10345:
URL: https://github.com/apache/hudi/pull/10345#issuecomment-1870843762
## CI report:
* b6ebd945a7d980ee5044edcd5b4f8390f6bd8adf Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
young138120 commented on issue #10423:
URL: https://github.com/apache/hudi/issues/10423#issuecomment-1870841132
> You might need to use higher hudi release for reading.
Is that the only way?
I cannot upgrade the release version
--
This is an automated message from the Apache Git
[
https://issues.apache.org/jira/browse/HUDI-7267?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
ASF GitHub Bot updated HUDI-7267:
-
Labels: pull-request-available (was: )
> csi will cause data loss during sql query
>
hudi-bot commented on PR #10426:
URL: https://github.com/apache/hudi/pull/10426#issuecomment-1870840605
## CI report:
* d51cd5d57bc3150b23f083be4f3b765fa893c877 UNKNOWN
Bot commands
@hudi-bot supports the following commands:
- `@hudi-bot run azure` re-run th
KnightChess opened a new pull request, #10427:
URL: https://github.com/apache/hudi/pull/10427
From the picture, CSI will use the parquet chunk block meta to calculate min/max
values and save them to the MDT col stats. For complex cols, such as *info
array>*, the parquet meta will contain only
`info.array.n
hudi-bot commented on PR #10342:
URL: https://github.com/apache/hudi/pull/10342#issuecomment-1870840422
## CI report:
* cb62ad9bed32bf3acc6f8227e5e824cb73e8f0e4 UNKNOWN
* a774c2a1efccf6012b20f0a94b44f8b3ae4cdbbe Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4
hudi-bot commented on PR #10278:
URL: https://github.com/apache/hudi/pull/10278#issuecomment-1870840351
## CI report:
* d98b47625ecada36364aa02aa1496dafd330c6a9 UNKNOWN
* ab0b2127349325a3c939fe65da9d8caaac0da018 UNKNOWN
* 24c73cbc59102945de2fbb9b9c1f1d0f3234c20a Azure:
[FAIL
KnightChess created HUDI-7267:
-
Summary: csi will cause data loss during sql query
Key: HUDI-7267
URL: https://issues.apache.org/jira/browse/HUDI-7267
Project: Apache Hudi
Issue Type: Bug
bvaradar commented on code in PR #10414:
URL: https://github.com/apache/hudi/pull/10414#discussion_r1437380985
##
hudi-spark-datasource/hudi-spark/src/test/scala/org/apache/spark/sql/hudi/TestHoodieTableValuedFunction.scala:
##
@@ -192,6 +192,69 @@ class TestHoodieTableValuedFun
beyond1920 commented on issue #10404:
URL: https://github.com/apache/hudi/issues/10404#issuecomment-1870820501
@wanghuan2054 The bundle jar is not packaged on the 0.14 branch.
You could cherry-pick the PR to the 0.14.1 branch instead of the master branch, then
build the bundle jar.
--
This is an a
hudi-bot commented on PR #10422:
URL: https://github.com/apache/hudi/pull/10422#issuecomment-1870820483
## CI report:
* 99517e23baa60a6a0602e9daf7f522f3c1dcfa1e UNKNOWN
* 84b89a52d9f2abf7720815e20756e46fc7bf24c5 Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4
[
https://issues.apache.org/jira/browse/HUDI-7265?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
ASF GitHub Bot updated HUDI-7265:
-
Labels: pull-request-available (was: )
> Support schema evolution by Flink SQL using HoodieHiveCa
beyond1920 opened a new pull request, #10426:
URL: https://github.com/apache/hudi/pull/10426
### Change Logs
Since Flink 1.17, Flink SQL supports more advanced ALTER TABLE syntax.
```
-- add a new column
ALTER TABLE MyTable ADD category_id STRING COMMENT 'identifier of the
danny0405 commented on code in PR #10413:
URL: https://github.com/apache/hudi/pull/10413#discussion_r1437370132
##
hudi-spark-datasource/hudi-spark/src/main/scala/org/apache/spark/sql/hudi/command/procedures/RepairOverwriteHoodiePropsProcedure.scala:
##
@@ -57,7 +67,7 @@ class R
danny0405 closed issue #10404: [SUPPORT] hoodie.properties file seems invalid.
Please check for left over .updated files if any, manually copy it to
hoodie.properties and retry
URL: https://github.com/apache/hudi/issues/10404
--
This is an automated message from the Apache Git Service.
To re
danny0405 commented on issue #10424:
URL: https://github.com/apache/hudi/issues/10424#issuecomment-1870812733
Are you using 0.14.0 or 1.0? We have this check in 1.0. Since 1.0, the
commit metadata file naming changed to:
`${start_time}_${completion_time}.${action}`
--
This is an automa
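The naming pattern above can be illustrated with a minimal sketch that splits a 1.0-style instant file name into its three parts. This is not Hudi's own parser; the class name and the timestamps below are invented for illustration of the `${start_time}_${completion_time}.${action}` convention.

```java
// Hypothetical sketch: split a 1.0-style commit metadata file name of the
// form ${start_time}_${completion_time}.${action} into its components.
public class InstantFileName {
    // Returns {startTime, completionTime, action}; the file name below is
    // an invented example, not a real instant from any table.
    public static String[] parse(String fileName) {
        int dot = fileName.lastIndexOf('.');
        String action = fileName.substring(dot + 1);          // e.g. "commit"
        String[] times = fileName.substring(0, dot).split("_");
        return new String[] {times[0], times[1], action};
    }

    public static void main(String[] args) {
        String[] parts = parse("20231228101530000_20231228101545000.commit");
        System.out.println(parts[0] + " " + parts[1] + " " + parts[2]);
    }
}
```

Note that 0.14.x file names carry only the start time, which is why (per the comment above) the completion time there has to come from the file's modification time instead.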
danny0405 commented on PR #10412:
URL: https://github.com/apache/hudi/pull/10412#issuecomment-1870810416
We have re-designed the lock acquisition since 1.0.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above
danny0405 commented on issue #10423:
URL: https://github.com/apache/hudi/issues/10423#issuecomment-1870809808
You might need to use higher hudi release for reading.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
U
danny0405 commented on issue #10404:
URL: https://github.com/apache/hudi/issues/10404#issuecomment-1870806823
Did you use the 1.0 release? For 0.14.0, the completion time is fetched
through the metadata file modification time.
--
This is an automated message from the Apache Git Service.
To res
xushiyan opened a new pull request, #10425:
URL: https://github.com/apache/hudi/pull/10425
### Change Logs
Add a new blog.
### Impact
NA
### Risk level
None.
### Documentation Update
NA
### Contributor's checklist
- [ ] Read throu
bhat-vinay commented on code in PR #10414:
URL: https://github.com/apache/hudi/pull/10414#discussion_r1437361618
##
hudi-spark-datasource/hudi-spark/src/test/scala/org/apache/spark/sql/hudi/TestHoodieTableValuedFunction.scala:
##
@@ -192,6 +192,69 @@ class TestHoodieTableValuedF
hudi-bot commented on PR #10422:
URL: https://github.com/apache/hudi/pull/10422#issuecomment-1870798404
## CI report:
* 99517e23baa60a6a0602e9daf7f522f3c1dcfa1e UNKNOWN
* d419efd8ffdafbc1076c25884d70cc012612d5a1 Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4
hudi-bot commented on PR #10344:
URL: https://github.com/apache/hudi/pull/10344#issuecomment-1870798265
## CI report:
* 50121f9c130e543642c114d8b96b72e88273d730 Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
hudi-bot commented on PR #10278:
URL: https://github.com/apache/hudi/pull/10278#issuecomment-1870798178
## CI report:
* d98b47625ecada36364aa02aa1496dafd330c6a9 UNKNOWN
* ab0b2127349325a3c939fe65da9d8caaac0da018 UNKNOWN
* 4709de3c2d239f5b5a974d4a637d153df0a5e215 Azure:
[FAIL
hudi-bot commented on PR #10342:
URL: https://github.com/apache/hudi/pull/10342#issuecomment-1870798249
## CI report:
* cb62ad9bed32bf3acc6f8227e5e824cb73e8f0e4 UNKNOWN
* 3f0829263192c35ae636e707106a97d7c0142ff7 Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4
hudi-bot commented on PR #10345:
URL: https://github.com/apache/hudi/pull/10345#issuecomment-1870798279
## CI report:
* b6ebd945a7d980ee5044edcd5b4f8390f6bd8adf Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
the-other-tim-brown commented on code in PR #10238:
URL: https://github.com/apache/hudi/pull/10238#discussion_r1437357185
##
hudi-common/src/main/java/org/apache/hudi/metadata/HoodieTableMetadataUtil.java:
##
@@ -652,7 +653,7 @@ public static HoodieData
convertMetadataToColumnS
hudi-bot commented on PR #10422:
URL: https://github.com/apache/hudi/pull/10422#issuecomment-1870795679
## CI report:
* 99517e23baa60a6a0602e9daf7f522f3c1dcfa1e UNKNOWN
* d419efd8ffdafbc1076c25884d70cc012612d5a1 Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4
hudi-bot commented on PR #10345:
URL: https://github.com/apache/hudi/pull/10345#issuecomment-1870795572
## CI report:
* b6ebd945a7d980ee5044edcd5b4f8390f6bd8adf Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
hudi-bot commented on PR #10344:
URL: https://github.com/apache/hudi/pull/10344#issuecomment-1870795546
## CI report:
* 50121f9c130e543642c114d8b96b72e88273d730 Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21
hudi-bot commented on PR #10342:
URL: https://github.com/apache/hudi/pull/10342#issuecomment-1870795525
## CI report:
* cb62ad9bed32bf3acc6f8227e5e824cb73e8f0e4 UNKNOWN
* 3f0829263192c35ae636e707106a97d7c0142ff7 Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4
hudi-bot commented on PR #10278:
URL: https://github.com/apache/hudi/pull/10278#issuecomment-1870795459
## CI report:
* d98b47625ecada36364aa02aa1496dafd330c6a9 UNKNOWN
* ab0b2127349325a3c939fe65da9d8caaac0da018 UNKNOWN
* 4709de3c2d239f5b5a974d4a637d153df0a5e215 Azure:
[FAIL
hudi-bot commented on PR #10278:
URL: https://github.com/apache/hudi/pull/10278#issuecomment-1870792527
## CI report:
* d98b47625ecada36364aa02aa1496dafd330c6a9 UNKNOWN
* ab0b2127349325a3c939fe65da9d8caaac0da018 UNKNOWN
* b68e198da7363339b4389a63d2d7ba181bbf9d54 Azure:
[FAIL
majian1998 closed pull request #10419: [HUDI-7264] In a Query-Only Spark
Session, the Latest Visible Commit Is Not Updated
URL: https://github.com/apache/hudi/pull/10419
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
majian1998 commented on PR #10419:
URL: https://github.com/apache/hudi/pull/10419#issuecomment-1870783894
@bvaradar:
If the relation is recreated, the file index will be forcefully refreshed
(the object is rebuilt).
In fact, this modification has, on one hand, indeed increased the cos
zhangjw123321 commented on issue #10418:
URL: https://github.com/apache/hudi/issues/10418#issuecomment-1870779093
First of all, thank you very much for your reply.
The advanced configuration is introduced here; after I configure this parameter,
it is not effective. I think the normal conf
the-other-tim-brown commented on PR #10345:
URL: https://github.com/apache/hudi/pull/10345#issuecomment-1870778723
> Can this issue be fixed and merged into the 0.14.1 branch as soon as
possible? @nsivabalan @danny0405 @the-other-tim-brown
It is up to the project committers. I've re-t
wanghuan2054 commented on issue #10404:
URL: https://github.com/apache/hudi/issues/10404#issuecomment-187074
> Create a [JIRA](https://issues.apache.org/jira/browse/HUDI-7262) to track
the issue
Thx. After fixing the bugs mentioned in your PR, we have met another issue, as
follows: http
zhangjw123321 commented on issue #10418:
URL: https://github.com/apache/hudi/issues/10418#issuecomment-1870777405
![image](https://github.com/apache/hudi/assets/154970920/b622352a-4680-471d-b825-34535cc0a126)
--
This is an automated message from the Apache Git Service.
To respond to t
the-other-tim-brown commented on code in PR #10344:
URL: https://github.com/apache/hudi/pull/10344#discussion_r1437344215
##
hudi-common/src/main/java/org/apache/hudi/common/util/collection/ExternalSpillableMap.java:
##
@@ -78,41 +78,49 @@ public class ExternalSpillableMap keySi
wanghuan2054 opened a new issue, #10424:
URL: https://github.com/apache/hudi/issues/10424
**Environment Description**
* Hudi version : 0.14
* Spark version : 3.0
* Hive version : 3.1.2
* Hadoop version : 3.2.1
* Storage (HDFS/S3/GCS..) : HDFS
* Runnin
zhangjw123321 commented on issue #10418:
URL: https://github.com/apache/hudi/issues/10418#issuecomment-1870775372
This dataset is imported from MySQL into the Hive table, using sqoop -m 1.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on t
hudi-bot commented on PR #10422:
URL: https://github.com/apache/hudi/pull/10422#issuecomment-1870773303
## CI report:
* 99517e23baa60a6a0602e9daf7f522f3c1dcfa1e UNKNOWN
* d419efd8ffdafbc1076c25884d70cc012612d5a1 Azure:
[FAILURE](https://dev.azure.com/apache-hudi-ci-org/785b6ef4
the-other-tim-brown commented on code in PR #10344:
URL: https://github.com/apache/hudi/pull/10344#discussion_r1437340978
##
hudi-common/src/main/java/org/apache/hudi/common/util/collection/ExternalSpillableMap.java:
##
@@ -78,41 +78,49 @@ public class ExternalSpillableMap keySi
hudi-bot commented on PR #10422:
URL: https://github.com/apache/hudi/pull/10422#issuecomment-1870770019
## CI report:
* 99517e23baa60a6a0602e9daf7f522f3c1dcfa1e UNKNOWN
* 7d6e12d4c5e1573d1b42b674a95890edfbb6b7a2 Azure:
[PENDING](https://dev.azure.com/apache-hudi-ci-org/785b6ef4
hudi-bot commented on PR #10422:
URL: https://github.com/apache/hudi/pull/10422#issuecomment-1870767047
## CI report:
* 99517e23baa60a6a0602e9daf7f522f3c1dcfa1e UNKNOWN
* 7d6e12d4c5e1573d1b42b674a95890edfbb6b7a2 UNKNOWN
Bot commands
@hudi-bot supports the followi
xuzifu666 commented on PR #10297:
URL: https://github.com/apache/hudi/pull/10297#issuecomment-1870747269
@bvaradar PTAL. This PR aims to fix cases where some condition, such as a write
data error, occurs but the writing stage does not error out; this would confuse
users, for example when the written data schema is not consistent with t
xuzifu666 commented on code in PR #10297:
URL: https://github.com/apache/hudi/pull/10297#discussion_r1437324192
##
hudi-client/hudi-client-common/src/main/java/org/apache/hudi/io/HoodieAppendHandle.java:
##
@@ -294,6 +295,9 @@ private Option prepareRecord(HoodieRecord
hoodieRec
hudi-bot commented on PR #10422:
URL: https://github.com/apache/hudi/pull/10422#issuecomment-1870745088
## CI report:
* 99517e23baa60a6a0602e9daf7f522f3c1dcfa1e UNKNOWN
Bot commands
@hudi-bot supports the following commands:
- `@hudi-bot run azure` re-run th
young138120 opened a new issue, #10423:
URL: https://github.com/apache/hudi/issues/10423
**Describe the problem you faced**
I am currently writing data to a MOR partitioned table using Flink. When
reading with Spark, I want to ignore the newly added uncompacted log data.
Due to conflicts i
jonvex commented on code in PR #10422:
URL: https://github.com/apache/hudi/pull/10422#discussion_r1437320766
##
docker/compose/docker-compose_hadoop284_hive233_spark244_mac_aarch64.yml:
##
@@ -129,10 +129,11 @@ services:
- ./hadoop.env
environment:
SERVICE_PRE
[
https://issues.apache.org/jira/browse/HUDI-6787?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
ASF GitHub Bot updated HUDI-6787:
-
Labels: pull-request-available (was: )
> Integrate FileGroupReader with HoodieMergeOnReadSnapshot
jonvex opened a new pull request, #10422:
URL: https://github.com/apache/hudi/pull/10422
### Change Logs
Replace existing hive read logic with filegroup reader
### Impact
Hive will be more maintainable
### Risk level (write none, low medium or high below)
h
young138120 closed issue #10327: [SUPPORT] Flink to hudi ,but the partitioned
table query is not expected
URL: https://github.com/apache/hudi/issues/10327
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to g
young138120 commented on issue #10327:
URL: https://github.com/apache/hudi/issues/10327#issuecomment-1870738635
It should be caused by different versions
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to
[
https://issues.apache.org/jira/browse/HUDI-6787?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Jonathan Vexler reassigned HUDI-6787:
-
Assignee: Jonathan Vexler
> Integrate FileGroupReader with HoodieMergeOnReadSnapshotReade
hudi-bot commented on PR #10416:
URL: https://github.com/apache/hudi/pull/10416#issuecomment-1870715451
## CI report:
* d8a1fc4adc069bdb337f05708f8a0e95f20d23ff Azure:
[SUCCESS](https://dev.azure.com/apache-hudi-ci-org/785b6ef4-2f42-4a89-8f0e-5f0d7039a0cc/_build/results?buildId=21