[ https://issues.apache.org/jira/browse/HUDI-1771?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17384039#comment-17384039 ]

ASF GitHub Bot commented on HUDI-1771:
--------------------------------------

danny0405 commented on a change in pull request #3285:
URL: https://github.com/apache/hudi/pull/3285#discussion_r672172626



##########
File path: hudi-flink/src/main/java/org/apache/hudi/table/format/mor/MergeOnReadInputFormat.java
##########
@@ -624,6 +634,14 @@ public boolean reachedEnd() throws IOException {
       return true;
     }
 
+    private Option<IndexedRecord> getInsetValue(String curKey) throws IOException {
+      final HoodieRecord<?> record = logRecords.get(curKey);
+      if (HoodieOperation.isDelete(record.getOperation())) {
+        return Option.empty();

Review comment:
      Put the `!emitDelete` check first.
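A minimal sketch of what the reviewer appears to suggest: test the cheap `emitDelete` flag before the per-record delete check, so the condition short-circuits. The names here are illustrative stand-ins, not the PR's actual code:

```java
public class SkipCheckSketch {
  /**
   * Hedged sketch of the suggested condition ordering; `emitDelete` stands in
   * for the scan-config flag and `isDelete` for HoodieOperation.isDelete(...).
   */
  static boolean shouldSkipDelete(boolean emitDelete, boolean isDelete) {
    // `!emitDelete` first: when deletes are emitted, the (potentially more
    // expensive) per-record delete check is short-circuited away.
    return !emitDelete && isDelete;
  }
}
```

With this ordering, a reader also sees at a glance that the branch is dead code whenever the format is configured to emit deletes.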

##########
File path: hudi-flink/src/test/java/org/apache/hudi/table/format/TestInputFormat.java
##########
@@ -175,13 +176,32 @@ void testReadWithDeletes() throws Exception {
 
     List<RowData> result = readData(inputFormat);
 
-    final String actual = TestData.rowDataToString(result);
+    final String actual = TestData.rowDataToString(result, true);
     final String expected = "["
-        + "id1,Danny,24,1970-01-01T00:00:00.001,par1, "
-        + "id2,Stephen,34,1970-01-01T00:00:00.002,par1, "
-        + "id3,null,null,null,null, "
-        + "id5,null,null,null,null, "
-        + "id9,null,null,null,null]";
+        + "+I(id1,Danny,24,1970-01-01T00:00:00.001,par1), "
+        + "+I(id2,Stephen,34,1970-01-01T00:00:00.002,par1), "
+        + "-D(id3,Julian,53,1970-01-01T00:00:00.003,par2), "
+        + "-D(id5,Sophia,18,1970-01-01T00:00:00.005,par3), "
+        + "-D(id9,Jane,19,1970-01-01T00:00:00.006,par3)]";
+    assertThat(actual, is(expected));
+  }
+
+  @Test
+  void testReadWithDeletesCOW() throws Exception {
+    beforeEach(HoodieTableType.COPY_ON_WRITE);
+
+    // write another commit to read again
+    TestData.writeData(TestData.DATA_SET_UPDATE_DELETE, conf);
+
+    InputFormat<RowData, ?> inputFormat = this.tableSource.getInputFormat();
+    assertThat(inputFormat, instanceOf(CopyOnWriteInputFormat.class));
+
+    List<RowData> result = readData(inputFormat);
+
+    final String actual = TestData.rowDataToString(result, true);
+    final String expected = "["

Review comment:
      I think Spark may need to set up the change flag column correctly when 
writing into Hudi, but Spark's `InternalRow` does not support a builtin 
change flag, so users may need to specify a column explicitly.
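To illustrate the point, here is a hedged sketch of reading the change flag from an explicit column and normalizing it to the `RowKind` short strings (`+I`, `-U`, `+U`, `-D`) that the expected strings in the test above use. The helper name and the raw flag values are hypothetical, not Hudi or Spark APIs:

```java
public class ChangeFlagSketch {
  /**
   * Hypothetical helper: since Spark's InternalRow carries no builtin
   * row-kind metadata, the change flag would come from a user-specified
   * column; here we map assumed raw values to RowKind-style short strings.
   */
  static String normalizeFlag(String flagColumnValue) {
    switch (flagColumnValue) {
      case "I":  return "+I"; // insert
      case "-U": return "-U"; // update-before
      case "U":  return "+U"; // update-after
      case "D":  return "-D"; // delete
      default:
        throw new IllegalArgumentException("Unknown change flag: " + flagColumnValue);
    }
  }
}
```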




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@hudi.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Propagate CDC format for hoodie
> -------------------------------
>
>                 Key: HUDI-1771
>                 URL: https://issues.apache.org/jira/browse/HUDI-1771
>             Project: Apache Hudi
>          Issue Type: New Feature
>          Components: Flink Integration
>            Reporter: Danny Chen
>            Assignee: Zheng yunhong
>            Priority: Major
>              Labels: pull-request-available, sev:normal
>             Fix For: 0.9.0
>
>
> As discussed on the dev mailing list: 
> https://lists.apache.org/thread.html/r31b2d1404e4e043a5f875b78105ba6f9a801e78f265ad91242ad5eb2%40%3Cdev.hudi.apache.org%3E
> Keeping the change flags makes new use cases possible: using Hudi as the 
> unified storage format for the DWD and DWS layers.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
