[I] [Bug]MySqlSource An error occurred when adding a table to the table.include.list attribute [flink-cdc]

2024-03-06 Thread via GitHub


xiahuya opened a new issue, #3112:
URL: https://github.com/apache/flink-cdc/issues/3112

   Flink CDC version: 2.3.0
   
   When using MySqlSource with 'table.include.list'=sqluser.pa_adm and 
checkpointing enabled, the first run of the program captures the data in the 
table 'sqluser.pa_adm' normally.
   
   When I changed 'table.include.list' to sqluser.pa_adm,sqluser.pa_person and 
recovered the program from a savepoint, it started reporting an error and was 
unable to capture data for sqluser.pa_person, throwing the exception:
   
   'Encountered change event 'Event{header=EventHeaderV4{timestamp=1709177391000, 
eventType=TABLE_MAP, serverId=1, headerLength=19, dataLength=117, 
nextPosition=769436194, flags=0}, data=TableMapEventData{tableId=5303, 
database='sqluser', table='pa_person', columnTypes=8, 15, 18, 18, 18, 18, 18, 
18, 18, 18, 18, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 15, 1, 15, 
columnMetadata=0, 192, 0, 0, 0, 0, 0, 0, 0, 0, 0, 192, 96, 96, 96, 96, 
96, 384, 96, 96, 384, 30, 30, 30, 30, 0, 96, columnNullability={5, 6, 7, 8, 9, 
10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26}, 
eventMetadata=TableMapEventMetadata{signedness={1}, defaultCharset=33, 
charsetCollations=null, columnCharsets=null, columnNames=null, 
setStrValues=null, enumStrValues=null, geometryTypes=null, 
simplePrimaryKeys=null, primaryKeysWithPrefix=null, 
enumAndSetDefaultCharset=null, enumAndSetColumnCharsets=null, 
visibility=null}}}' at offset {transaction_id=null, ts_sec=1709177391, 
file=binlog.000476, pos=769435520, server_id=1, event=3} for table 
sqluser.pa_person whose schema isn't known to this connector. One possible 
cause is an incomplete database history topic. Take a new snapshot in this 
case.'
   
   After reading the source code: when the program is restarted and restored 
from state, the schema of the newly added table sqluser.pa_person does not 
exist.
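   For reference, in the Flink CDC 2.x DataStream API the MySQL source builder 
exposes scanNewlyAddedTableEnabled, which is intended to let tables added to 
the table list after the state was taken be snapshotted on restore instead of 
failing with the "schema isn't known" error. A minimal sketch (hostname, 
credentials, and job names below are placeholders, not taken from the report):

   ```java
   import org.apache.flink.api.common.eventtime.WatermarkStrategy;
   import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

   import com.ververica.cdc.connectors.mysql.source.MySqlSource;
   import com.ververica.cdc.debezium.JsonDebeziumDeserializationSchema;

   public class NewlyAddedTableJob {
       public static void main(String[] args) throws Exception {
           // Sketch only: connection settings are placeholders.
           MySqlSource<String> source = MySqlSource.<String>builder()
                   .hostname("localhost")
                   .port(3306)
                   .databaseList("sqluser")
                   // sqluser.pa_person was added here before restoring
                   // the job from a savepoint.
                   .tableList("sqluser.pa_adm", "sqluser.pa_person")
                   .username("flinkuser")
                   .password("flinkpw")
                   // Allows snapshotting tables that were added to the
                   // table list after the job's state was taken.
                   .scanNewlyAddedTableEnabled(true)
                   .deserializer(new JsonDebeziumDeserializationSchema())
                   .build();

           StreamExecutionEnvironment env =
                   StreamExecutionEnvironment.getExecutionEnvironment();
           env.enableCheckpointing(3000);
           env.fromSource(source, WatermarkStrategy.noWatermarks(),
                           "MySQL CDC Source")
              .print();
           env.execute("newly-added-table");
       }
   }
   ```

   Whether this option resolves the reported case on 2.3.0 would need to be 
verified against that version's behavior.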
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [I] [Doc] restart pipeline job from a specific savepoint file. [flink-cdc]

2024-03-06 Thread via GitHub


lvyanquan commented on issue #2940:
URL: https://github.com/apache/flink-cdc/issues/2940#issuecomment-1982640707

   Closed and tracked in https://issues.apache.org/jira/browse/FLINK-34613.





Re: [I] [Doc] restart pipeline job from a specific savepoint file. [flink-cdc]

2024-03-06 Thread via GitHub


lvyanquan closed issue #2940: [Doc] restart pipeline job from a specific 
savepoint file.
URL: https://github.com/apache/flink-cdc/issues/2940





Re: [PR] [cdc-cli] support recover from a specific savepoint file [flink-cdc]

2024-03-06 Thread via GitHub


lvyanquan commented on PR #2959:
URL: https://github.com/apache/flink-cdc/pull/2959#issuecomment-1982621460

   Rebased to master.





Re: [I] Flink CDC 3.1.0 Plan [flink-cdc]

2024-03-06 Thread via GitHub


leonardBang commented on issue #2861:
URL: https://github.com/apache/flink-cdc/issues/2861#issuecomment-1982460840

   > hi @leonardBang, do you have a plan or roadmap for module 
flink-cdc-pipeline-connectors to support pipelines with a MongoDB source?
   
   Yes, it's on the plan, but we may consider it in the next version.





Re: [PR] [FLINK-34180] Migrate doc website from ververica to flink [flink-cdc]

2024-03-06 Thread via GitHub


PatrickRen commented on code in PR #3028:
URL: https://github.com/apache/flink-cdc/pull/3028#discussion_r1515585603


##
.dlc.json:
##
@@ -0,0 +1,32 @@
+{

Review Comment:
   What about introducing this file after we enable the dead-link check in the 
future? I'm not sure the links in this file are currently related to Flink 
CDC's docs.






(flink-cdc) branch gh-pages updated (18544edd2 -> eb63c4e1c)

2024-03-06 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to branch gh-pages
in repository https://gitbox.apache.org/repos/asf/flink-cdc.git


 discard 18544edd2 Generated docs from commit 
e10c8691add2cfedfe859a5e51115a08212d7cf0
 new eb63c4e1c Generated docs from commit 
86272bf1029022adbf6d34132f4b34df14f2ad89

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (18544edd2)
\
 N -- N -- N   refs/heads/gh-pages (eb63c4e1c)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 NOTICE   |   5 -
 master/.doctrees/environment.pickle  | Bin 238146 -> 238146 bytes
 release-1.4/.doctrees/environment.pickle | Bin 44175 -> 44175 bytes
 release-2.0/.doctrees/environment.pickle | Bin 48679 -> 48679 bytes
 release-2.1/.doctrees/environment.pickle | Bin 92968 -> 92968 bytes
 release-2.2/.doctrees/environment.pickle | Bin 152545 -> 152545 bytes
 release-2.3/.doctrees/environment.pickle | Bin 160641 -> 160641 bytes
 release-2.4/.doctrees/environment.pickle | Bin 181093 -> 181093 bytes
 release-3.0/.doctrees/environment.pickle | Bin 237316 -> 237316 bytes
 9 files changed, 4 insertions(+), 1 deletion(-)



(flink-cdc) branch master updated: [FLINK-34183][cdc][license] Update NOTICE files (#3107)

2024-03-06 Thread jiabaosun
This is an automated email from the ASF dual-hosted git repository.

jiabaosun pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink-cdc.git


The following commit(s) were added to refs/heads/master by this push:
 new 86272bf10 [FLINK-34183][cdc][license] Update NOTICE files (#3107)
86272bf10 is described below

commit 86272bf1029022adbf6d34132f4b34df14f2ad89
Author: Hang Ruan 
AuthorDate: Thu Mar 7 11:37:58 2024 +0800

[FLINK-34183][cdc][license] Update NOTICE files (#3107)
---
 NOTICE  | 17 ++---
 docs/site/NOTICE|  5 -
 .../src/main/resources/META-INF/NOTICE  |  5 -
 .../src/main/resources/META-INF/NOTICE  |  5 -
 .../src/main/resources/META-INF/NOTICE  |  5 -
 .../src/main/resources/META-INF/NOTICE  |  5 -
 .../src/main/resources/META-INF/NOTICE  |  5 -
 .../src/main/resources/META-INF/NOTICE  |  5 -
 .../src/main/resources/META-INF/NOTICE  |  5 -
 9 files changed, 46 insertions(+), 11 deletions(-)

diff --git a/NOTICE b/NOTICE
index 8f1d8491f..df01cd954 100644
--- a/NOTICE
+++ b/NOTICE
@@ -1,3 +1,14 @@
-flink-cdc-connectors
-Copyright 2023 Ververica Inc.
-Apache Flink, Flink®, Apache®, the squirrel logo, and the Apache feather logo 
are either registered trademarks or trademarks of The Apache Software 
Foundation.
\ No newline at end of file
+Apache Flink CDC
+Copyright 2024 The Apache Software Foundation
+
+This product includes software developed at
+The Apache Software Foundation (http://www.apache.org/).
+
+Permission to use, copy, modify, and/or distribute this software for any 
purpose with or without fee is hereby
+granted, provided that this permission notice appear in all copies.
+
+THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH 
REGARD TO THIS SOFTWARE INCLUDING
+ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE 
AUTHOR BE LIABLE FOR ANY SPECIAL,
+DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING 
FROM LOSS OF USE, DATA OR PROFITS,
+WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING 
OUT OF OR IN CONNECTION WITH THE
+USE OR PERFORMANCE OF THIS SOFTWARE.
diff --git a/docs/site/NOTICE b/docs/site/NOTICE
index de44b032f..8f32cb027 100644
--- a/docs/site/NOTICE
+++ b/docs/site/NOTICE
@@ -1,5 +1,8 @@
 Flink CDC Site
-Copyright 2023 Ververica Inc.
+Copyright 2024 The Apache Software Foundation
+
+This product includes software developed at
+The Apache Software Foundation (http://www.apache.org/).
 
 This project bundles the following dependencies under the MIT license.
 See bundled license files for details.
diff --git 
a/flink-cdc-connect/flink-cdc-source-connectors/flink-sql-connector-db2-cdc/src/main/resources/META-INF/NOTICE
 
b/flink-cdc-connect/flink-cdc-source-connectors/flink-sql-connector-db2-cdc/src/main/resources/META-INF/NOTICE
index b8b0ac45f..071a4c8d0 100644
--- 
a/flink-cdc-connect/flink-cdc-source-connectors/flink-sql-connector-db2-cdc/src/main/resources/META-INF/NOTICE
+++ 
b/flink-cdc-connect/flink-cdc-source-connectors/flink-sql-connector-db2-cdc/src/main/resources/META-INF/NOTICE
@@ -1,5 +1,8 @@
 flink-sql-connector-db2-cdc
-Copyright 2020 Ververica Inc.
+Copyright 2024 The Apache Software Foundation
+
+This product includes software developed at
+The Apache Software Foundation (http://www.apache.org/).
 
 This project bundles the following dependencies under the Apache Software 
License 2.0. (http://www.apache.org/licenses/LICENSE-2.0.txt)
 
diff --git 
a/flink-cdc-connect/flink-cdc-source-connectors/flink-sql-connector-mongodb-cdc/src/main/resources/META-INF/NOTICE
 
b/flink-cdc-connect/flink-cdc-source-connectors/flink-sql-connector-mongodb-cdc/src/main/resources/META-INF/NOTICE
index 2f9c97539..7fc96c2c3 100644
--- 
a/flink-cdc-connect/flink-cdc-source-connectors/flink-sql-connector-mongodb-cdc/src/main/resources/META-INF/NOTICE
+++ 
b/flink-cdc-connect/flink-cdc-source-connectors/flink-sql-connector-mongodb-cdc/src/main/resources/META-INF/NOTICE
@@ -1,5 +1,8 @@
 flink-sql-connector-mongodb-cdc
-Copyright 2020 Ververica Inc.
+Copyright 2024 The Apache Software Foundation
+
+This product includes software developed at
+The Apache Software Foundation (http://www.apache.org/).
 
 This project bundles the following dependencies under the Apache Software 
License 2.0. (http://www.apache.org/licenses/LICENSE-2.0.txt)
 
diff --git 
a/flink-cdc-connect/flink-cdc-source-connectors/flink-sql-connector-mysql-cdc/src/main/resources/META-INF/NOTICE
 
b/flink-cdc-connect/flink-cdc-source-connectors/flink-sql-connector-mysql-cdc/src/main/resources/META-INF/NOTICE
index 84b59d497..f34582c1d 100644
--- 
a/flink-cdc-connect/flink-cdc-source-connectors/flink-sql-connector-mysql-cdc/src/main/resour

Re: [PR] [FLINK-34183] Update NOTICE files [flink-cdc]

2024-03-06 Thread via GitHub


Jiabao-Sun merged PR #3107:
URL: https://github.com/apache/flink-cdc/pull/3107





Re: [I] The document button on the web homepage cannot be accessed [flink-cdc]

2024-03-06 Thread via GitHub


xmzhou00 closed issue #3111: The document button on the web homepage cannot be 
accessed
URL: https://github.com/apache/flink-cdc/issues/3111





Re: [I] Flink CDC 3.1.0 Plan [flink-cdc]

2024-03-06 Thread via GitHub


viethung2281996 commented on issue #2861:
URL: https://github.com/apache/flink-cdc/issues/2861#issuecomment-1982215293

   hi @leonardBang, do you have a plan or roadmap for module 
flink-cdc-pipeline-connectors to support pipelines with a MongoDB source?





Re: [PR] [cdc-e2e] Add a debug entry. [flink-cdc]

2024-03-06 Thread via GitHub


joyCurry30 closed pull request #2814: [cdc-e2e] Add a debug entry.
URL: https://github.com/apache/flink-cdc/pull/2814





(flink-connector-mongodb) branch main updated: [hotfix] Add IssueNavigationLink for IDEA git log (#30)

2024-03-06 Thread jiabaosun
This is an automated email from the ASF dual-hosted git repository.

jiabaosun pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-mongodb.git


The following commit(s) were added to refs/heads/main by this push:
 new da7472a  [hotfix] Add IssueNavigationLink for IDEA git log (#30)
da7472a is described below

commit da7472a69a786582d33081f7424dc6a5bcd5e172
Author: gongzhongqiang 
AuthorDate: Thu Mar 7 09:56:08 2024 +0800

[hotfix] Add IssueNavigationLink for IDEA git log (#30)
---
 .idea/vcs.xml | 22 --
 1 file changed, 20 insertions(+), 2 deletions(-)

diff --git a/.idea/vcs.xml b/.idea/vcs.xml
index 35eb1dd..6c7c9bb 100644
--- a/.idea/vcs.xml
+++ b/.idea/vcs.xml
@@ -1,6 +1,24 @@
 
 
+  
+
+  
+
+  
+  https://issues.apache.org/jira/browse/$0"; />
+
+
+  
+  https://cwiki.apache.org/confluence/display/FLINK/$0"; />
+
+
+  
+  https://github.com/apache/flink-connector-mongodb/pull/$1"; />
+
+  
+
+  
   
-
+
   
-
\ No newline at end of file
+



(flink-connector-jdbc) branch main updated: [hotfix] Correct CrateDBDialectTypeTest package

2024-03-06 Thread jiabaosun
This is an automated email from the ASF dual-hosted git repository.

jiabaosun pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-jdbc.git


The following commit(s) were added to refs/heads/main by this push:
 new 95294ffb [hotfix] Correct CrateDBDialectTypeTest package
95294ffb is described below

commit 95294ffbc57c93c2af34cda94c27fc5781e06177
Author: gongzhongqiang <764629...@qq.com>
AuthorDate: Thu Mar 7 09:27:02 2024 +0800

[hotfix] Correct CrateDBDialectTypeTest package
---
 .../cratedb => databases/cratedb/dialect}/CrateDBDialectTypeTest.java   | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git 
a/flink-connector-jdbc/src/test/java/org/apache/flink/connector/jdbc/dialect/cratedb/CrateDBDialectTypeTest.java
 
b/flink-connector-jdbc/src/test/java/org/apache/flink/connector/jdbc/databases/cratedb/dialect/CrateDBDialectTypeTest.java
similarity index 97%
rename from 
flink-connector-jdbc/src/test/java/org/apache/flink/connector/jdbc/dialect/cratedb/CrateDBDialectTypeTest.java
rename to 
flink-connector-jdbc/src/test/java/org/apache/flink/connector/jdbc/databases/cratedb/dialect/CrateDBDialectTypeTest.java
index 563adb2d..8f95af48 100644
--- 
a/flink-connector-jdbc/src/test/java/org/apache/flink/connector/jdbc/dialect/cratedb/CrateDBDialectTypeTest.java
+++ 
b/flink-connector-jdbc/src/test/java/org/apache/flink/connector/jdbc/databases/cratedb/dialect/CrateDBDialectTypeTest.java
@@ -16,7 +16,7 @@
  * limitations under the License.
  */
 
-package org.apache.flink.connector.jdbc.dialect.cratedb;
+package org.apache.flink.connector.jdbc.databases.cratedb.dialect;
 
 import org.apache.flink.connector.jdbc.dialect.JdbcDialectTypeTest;
 



(flink-connector-hbase) branch dependabot/maven/org.apache.commons-commons-compress-1.26.0 deleted (was 319cfdd)

2024-03-06 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to branch 
dependabot/maven/org.apache.commons-commons-compress-1.26.0
in repository https://gitbox.apache.org/repos/asf/flink-connector-hbase.git


 was 319cfdd  Bump org.apache.commons:commons-compress from 1.23.0 to 1.26.0

The revisions that were on this branch are still contained in
other references; therefore, this change does not discard any commits
from the repository.



(flink-connector-hbase) branch main updated (9cbc109 -> dfe7646)

2024-03-06 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-hbase.git


from 9cbc109  [FLINK-34413] Remove HBase 1.x connector files and deps. This 
closes #42
 add dfe7646  [FLINK-34575] Bump org.apache.commons:commons-compress from 
1.23.0 to 1.26.0. This closes #41

No new revisions were added by this update.

Summary of changes:
 pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



(flink-connector-hbase) branch dependabot/maven/org.apache.zookeeper-zookeeper-3.7.2 updated (91c0a80 -> 248ba58)

2024-03-06 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to branch 
dependabot/maven/org.apache.zookeeper-zookeeper-3.7.2
in repository https://gitbox.apache.org/repos/asf/flink-connector-hbase.git


 discard 91c0a80  Bump org.apache.zookeeper:zookeeper from 3.4.14 to 3.7.2
 add 42870d5  [hotfix] Remove 1.19-SNAPSHOT since HBase currently can't 
compile for that version
 add 91d166d  [hotfix] Update copyright year to 2024
 add 9cbc109  [FLINK-34413] Remove HBase 1.x connector files and deps. This 
closes #42
 add 248ba58  Bump org.apache.zookeeper:zookeeper from 3.4.14 to 3.7.2

 * -- * -- B -- O -- O -- O   (91c0a80)
\
 N -- N -- N   
refs/heads/dependabot/maven/org.apache.zookeeper-zookeeper-3.7.2 (248ba58)

No new revisions were added by this update.

Summary of changes:
 .github/workflows/push_pr.yml  |2 +-
 .github/workflows/weekly.yml   |3 -
 docs/content.zh/docs/connectors/table/hbase.md |3 +-
 docs/content/docs/connectors/table/hbase.md|3 +-
 docs/data/hbase.yml|2 -
 .../3ba92d3a-6609-4295-92ed-f2fe207ee2b3   |0
 .../ffbddcc3-857a-4af7-a6b5-fcf71e2cc191   |0
 .../archunit-violations/stored.rules   |4 -
 flink-connector-hbase-1.4/pom.xml  |  443 --
 .../hbase1/HBase1DynamicTableFactory.java  |  186 ---
 .../hbase1/sink/HBaseDynamicTableSink.java |  143 --
 .../hbase1/source/AbstractTableInputFormat.java|  313 
 .../hbase1/source/HBaseDynamicTableSource.java |   78 -
 .../hbase1/source/HBaseRowDataInputFormat.java |   96 --
 .../org.apache.flink.table.factories.Factory   |   16 -
 .../architecture/TestCodeArchitectureTest.java |   40 -
 .../connector/hbase1/HBaseConnectorITCase.java |  758 --
 .../hbase1/HBaseDynamicTableFactoryTest.java   |  348 -
 .../flink/connector/hbase1/HBaseTablePlanTest.java |  138 --
 .../flink/connector/hbase1/util/HBaseTestBase.java |  307 
 .../util/HBaseTestingClusterAutoStarter.java   |  193 ---
 .../java/org/slf4j/impl/Log4jLoggerAdapter.java|   22 -
 .../src/test/resources/archunit.properties |   31 -
 .../src/test/resources/hbase-site.xml  |   29 -
 .../src/test/resources/log4j2-test.properties  |   28 -
 .../flink/connector/hbase1/HBaseTablePlanTest.xml  |   36 -
 flink-connector-hbase-base/pom.xml |   10 +-
 flink-connector-hbase-e2e-tests/pom.xml|   15 -
 .../apache/flink/streaming/tests/HBaseITCase.java  |   24 +-
 flink-sql-connector-hbase-1.4/pom.xml  |  157 --
 .../src/main/resources/META-INF/NOTICE |   63 -
 .../resources/META-INF/licenses/LICENSE.protobuf   |   32 -
 .../src/main/resources/hbase-default.xml   | 1558 
 .../src/main/resources/META-INF/NOTICE |2 +-
 pom.xml|   25 +-
 35 files changed, 42 insertions(+), 5066 deletions(-)
 delete mode 100644 
flink-connector-hbase-1.4/archunit-violations/3ba92d3a-6609-4295-92ed-f2fe207ee2b3
 delete mode 100644 
flink-connector-hbase-1.4/archunit-violations/ffbddcc3-857a-4af7-a6b5-fcf71e2cc191
 delete mode 100644 flink-connector-hbase-1.4/archunit-violations/stored.rules
 delete mode 100644 flink-connector-hbase-1.4/pom.xml
 delete mode 100644 
flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/HBase1DynamicTableFactory.java
 delete mode 100644 
flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/sink/HBaseDynamicTableSink.java
 delete mode 100644 
flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/source/AbstractTableInputFormat.java
 delete mode 100644 
flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/source/HBaseDynamicTableSource.java
 delete mode 100644 
flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/source/HBaseRowDataInputFormat.java
 delete mode 100644 
flink-connector-hbase-1.4/src/main/resources/META-INF/services/org.apache.flink.table.factories.Factory
 delete mode 100644 
flink-connector-hbase-1.4/src/test/java/org/apache/flink/architecture/TestCodeArchitectureTest.java
 delete mode 100644 
flink-connector-hbase-1.4/src/test/java/org/apache/fl

(flink-connector-hbase) branch dependabot/maven/org.apache.commons-commons-compress-1.26.0 updated (8ed7aeb -> 319cfdd)

2024-03-06 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to branch 
dependabot/maven/org.apache.commons-commons-compress-1.26.0
in repository https://gitbox.apache.org/repos/asf/flink-connector-hbase.git


 discard 8ed7aeb  Bump org.apache.commons:commons-compress from 1.23.0 to 1.26.0
 add 9cbc109  [FLINK-34413] Remove HBase 1.x connector files and deps. This 
closes #42
 add 319cfdd  Bump org.apache.commons:commons-compress from 1.23.0 to 1.26.0

 * -- * -- B -- O -- O -- O   (8ed7aeb)
\
 N -- N -- N   
refs/heads/dependabot/maven/org.apache.commons-commons-compress-1.26.0 (319cfdd)

No new revisions were added by this update.

Summary of changes:
 docs/content.zh/docs/connectors/table/hbase.md |3 +-
 docs/content/docs/connectors/table/hbase.md|3 +-
 docs/data/hbase.yml|2 -
 .../3ba92d3a-6609-4295-92ed-f2fe207ee2b3   |0
 .../ffbddcc3-857a-4af7-a6b5-fcf71e2cc191   |0
 .../archunit-violations/stored.rules   |4 -
 flink-connector-hbase-1.4/pom.xml  |  443 --
 .../hbase1/HBase1DynamicTableFactory.java  |  186 ---
 .../hbase1/sink/HBaseDynamicTableSink.java |  143 --
 .../hbase1/source/AbstractTableInputFormat.java|  313 
 .../hbase1/source/HBaseDynamicTableSource.java |   78 -
 .../hbase1/source/HBaseRowDataInputFormat.java |   96 --
 .../org.apache.flink.table.factories.Factory   |   16 -
 .../architecture/TestCodeArchitectureTest.java |   40 -
 .../connector/hbase1/HBaseConnectorITCase.java |  758 --
 .../hbase1/HBaseDynamicTableFactoryTest.java   |  348 -
 .../flink/connector/hbase1/HBaseTablePlanTest.java |  138 --
 .../flink/connector/hbase1/util/HBaseTestBase.java |  307 
 .../util/HBaseTestingClusterAutoStarter.java   |  193 ---
 .../java/org/slf4j/impl/Log4jLoggerAdapter.java|   22 -
 .../src/test/resources/archunit.properties |   31 -
 .../src/test/resources/hbase-site.xml  |   29 -
 .../src/test/resources/log4j2-test.properties  |   28 -
 .../flink/connector/hbase1/HBaseTablePlanTest.xml  |   36 -
 flink-connector-hbase-base/pom.xml |   10 +-
 flink-connector-hbase-e2e-tests/pom.xml|   15 -
 .../apache/flink/streaming/tests/HBaseITCase.java  |   24 +-
 flink-sql-connector-hbase-1.4/pom.xml  |  157 --
 .../src/main/resources/META-INF/NOTICE |   63 -
 .../resources/META-INF/licenses/LICENSE.protobuf   |   32 -
 .../src/main/resources/hbase-default.xml   | 1558 
 pom.xml|   25 +-
 32 files changed, 40 insertions(+), 5061 deletions(-)
 delete mode 100644 
flink-connector-hbase-1.4/archunit-violations/3ba92d3a-6609-4295-92ed-f2fe207ee2b3
 delete mode 100644 
flink-connector-hbase-1.4/archunit-violations/ffbddcc3-857a-4af7-a6b5-fcf71e2cc191
 delete mode 100644 flink-connector-hbase-1.4/archunit-violations/stored.rules
 delete mode 100644 flink-connector-hbase-1.4/pom.xml
 delete mode 100644 
flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/HBase1DynamicTableFactory.java
 delete mode 100644 
flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/sink/HBaseDynamicTableSink.java
 delete mode 100644 
flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/source/AbstractTableInputFormat.java
 delete mode 100644 
flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/source/HBaseDynamicTableSource.java
 delete mode 100644 
flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/source/HBaseRowDataInputFormat.java
 delete mode 100644 
flink-connector-hbase-1.4/src/main/resources/META-INF/services/org.apache.flink.table.factories.Factory
 delete mode 100644 
flink-connector-hbase-1.4/src/test/java/org/apache/flink/architecture/TestCodeArchitectureTest.java
 delete mode 100644 
flink-connector-hbase-1.4/src/test/java/org/apache/flink/connector/hbase1/HBaseConnectorITCase.java
 delete mode 100644 
flink-connector-hbase-1.4/src/test/java/org/apache/flink/connector/hbase1/HBaseDynamicTableFactoryTest.java
 delete mode 100644 
flink-connector-hbase-1.4/src/test/java/org/apache/flink/connector/hbase1/HBaseTablePlanTest.java
 delete mode 100644 
flin

(flink-connector-hbase) branch dependabot/maven/flink-connector-hbase-base/com.google.guava-guava-32.0.0-jre created (now 2cf525c)

2024-03-06 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to branch 
dependabot/maven/flink-connector-hbase-base/com.google.guava-guava-32.0.0-jre
in repository https://gitbox.apache.org/repos/asf/flink-connector-hbase.git


  at 2cf525c  Bump com.google.guava:guava in /flink-connector-hbase-base

No new revisions were added by this update.



(flink-connector-hbase) branch main updated (91d166d -> 9cbc109)

2024-03-06 Thread martijnvisser
This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/flink-connector-hbase.git


from 91d166d  [hotfix] Update copyright year to 2024
 add 9cbc109  [FLINK-34413] Remove HBase 1.x connector files and deps. This 
closes #42

No new revisions were added by this update.

Summary of changes:
 docs/content.zh/docs/connectors/table/hbase.md |3 +-
 docs/content/docs/connectors/table/hbase.md|3 +-
 docs/data/hbase.yml|2 -
 .../3ba92d3a-6609-4295-92ed-f2fe207ee2b3   |0
 .../ffbddcc3-857a-4af7-a6b5-fcf71e2cc191   |0
 .../archunit-violations/stored.rules   |4 -
 flink-connector-hbase-1.4/pom.xml  |  443 --
 .../hbase1/HBase1DynamicTableFactory.java  |  186 ---
 .../hbase1/sink/HBaseDynamicTableSink.java |  143 --
 .../hbase1/source/AbstractTableInputFormat.java|  313 
 .../hbase1/source/HBaseDynamicTableSource.java |   78 -
 .../hbase1/source/HBaseRowDataInputFormat.java |   96 --
 .../org.apache.flink.table.factories.Factory   |   16 -
 .../architecture/TestCodeArchitectureTest.java |   40 -
 .../connector/hbase1/HBaseConnectorITCase.java |  758 --
 .../hbase1/HBaseDynamicTableFactoryTest.java   |  348 -
 .../flink/connector/hbase1/HBaseTablePlanTest.java |  138 --
 .../flink/connector/hbase1/util/HBaseTestBase.java |  307 
 .../util/HBaseTestingClusterAutoStarter.java   |  193 ---
 .../java/org/slf4j/impl/Log4jLoggerAdapter.java|   22 -
 .../src/test/resources/archunit.properties |   31 -
 .../src/test/resources/hbase-site.xml  |   29 -
 .../src/test/resources/log4j2-test.properties  |   28 -
 .../flink/connector/hbase1/HBaseTablePlanTest.xml  |   36 -
 flink-connector-hbase-base/pom.xml |   10 +-
 flink-connector-hbase-e2e-tests/pom.xml|   15 -
 .../apache/flink/streaming/tests/HBaseITCase.java  |   24 +-
 flink-sql-connector-hbase-1.4/pom.xml  |  157 --
 .../src/main/resources/META-INF/NOTICE |   63 -
 .../resources/META-INF/licenses/LICENSE.protobuf   |   32 -
 .../src/main/resources/hbase-default.xml   | 1558 
 pom.xml|   25 +-
 32 files changed, 40 insertions(+), 5061 deletions(-)
 delete mode 100644 flink-connector-hbase-1.4/archunit-violations/3ba92d3a-6609-4295-92ed-f2fe207ee2b3
 delete mode 100644 flink-connector-hbase-1.4/archunit-violations/ffbddcc3-857a-4af7-a6b5-fcf71e2cc191
 delete mode 100644 flink-connector-hbase-1.4/archunit-violations/stored.rules
 delete mode 100644 flink-connector-hbase-1.4/pom.xml
 delete mode 100644 flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/HBase1DynamicTableFactory.java
 delete mode 100644 flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/sink/HBaseDynamicTableSink.java
 delete mode 100644 flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/source/AbstractTableInputFormat.java
 delete mode 100644 flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/source/HBaseDynamicTableSource.java
 delete mode 100644 flink-connector-hbase-1.4/src/main/java/org/apache/flink/connector/hbase1/source/HBaseRowDataInputFormat.java
 delete mode 100644 flink-connector-hbase-1.4/src/main/resources/META-INF/services/org.apache.flink.table.factories.Factory
 delete mode 100644 flink-connector-hbase-1.4/src/test/java/org/apache/flink/architecture/TestCodeArchitectureTest.java
 delete mode 100644 flink-connector-hbase-1.4/src/test/java/org/apache/flink/connector/hbase1/HBaseConnectorITCase.java
 delete mode 100644 flink-connector-hbase-1.4/src/test/java/org/apache/flink/connector/hbase1/HBaseDynamicTableFactoryTest.java
 delete mode 100644 flink-connector-hbase-1.4/src/test/java/org/apache/flink/connector/hbase1/HBaseTablePlanTest.java
 delete mode 100644 flink-connector-hbase-1.4/src/test/java/org/apache/flink/connector/hbase1/util/HBaseTestBase.java
 delete mode 100644 flink-connector-hbase-1.4/src/test/java/org/apache/flink/connector/hbase1/util/HBaseTestingClusterAutoStarter.java
 delete mode 100644 flink-connector-hbase-1.4/src/test/java/org/slf4j/impl/Log4jLoggerAdapter.java
 delete mode 100644 flink-connector-hbase-1.4/src/test/resources/archunit.properties
 delete mode 100644 flink-connector-hbase-1.4/src/test/resources/hbase-site.xml
 delete mode 100644 flink-connector-hbase-1.4/src/test/resources/log4j2-test.properties
 delete mode 100644 flink-connector-hbase-1.4/src/test/resources/org/apache/flink/connector/hbase1/HBaseTablePlanTest.xml
 delete mode 100644 flink-sql-connector-hbase-1.4/pom.xml
 delete mode 100644 flink-sql-connector-hbase-1.4/src/main/resources/META-INF/NOTICE
 delete mode 100644 flink-sql-connector-hbase-1.4/sr

svn commit: r67758 - /dev/flink/flink-1.19.0-rc1/

2024-03-06 Thread lincoln
Author: lincoln
Date: Wed Mar  6 16:16:20 2024
New Revision: 67758

Log:
Remove flink-1.19.0-rc1

Removed:
dev/flink/flink-1.19.0-rc1/



Re: [I] Error on Postgres-CDC using incremental snapshot with UUID column as PK [flink-cdc]

2024-03-06 Thread via GitHub


olivier-derom commented on issue #3108:
URL: https://github.com/apache/flink-cdc/issues/3108#issuecomment-1980957733

   I figured this is a limitation of Postgres and not Flink-CDC, so I tried to 
create my own min(uuid, uuid) and max(uuid, uuid) functions, as well as the 
aggregates min(uuid) and max(uuid).
   They work perfectly when I execute the SQL statements myself on the database, 
but Flink-CDC still says 'function min(uuid) does not exist'.
   Does it somehow not have access to custom functions?
   The database connection in Flink-CDC is made with the same credentials I 
created the functions with, so the user should have access.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



(flink) branch release-1.19 updated: Revert "[FLINK-33532][network] Move the serialization of ShuffleDescriptorGroup out of the RPC main thread]"

2024-03-06 Thread zhuzh
This is an automated email from the ASF dual-hosted git repository.

zhuzh pushed a commit to branch release-1.19
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.19 by this push:
 new 837f8e58485 Revert "[FLINK-33532][network] Move the serialization of 
ShuffleDescriptorGroup out of the RPC main thread]"
837f8e58485 is described below

commit 837f8e584850bdcbc586a54f58e3fe953a816be8
Author: caodizhou 
AuthorDate: Wed Mar 6 14:11:56 2024 +0800

Revert "[FLINK-33532][network] Move the serialization of 
ShuffleDescriptorGroup out of the RPC main thread]"

This reverts commit d18a4bfe596fc580f8280750fa3bfa22007671d9.

(cherry picked from commit 7a709bf323c1cce3440887fe937311bae119aab0)
---
 .../org/apache/flink/runtime/blob/BlobWriter.java  | 11 ++--
 .../deployment/CachedShuffleDescriptors.java   |  2 +-
 .../deployment/InputGateDeploymentDescriptor.java  | 41 ++-
 .../deployment/TaskDeploymentDescriptor.java   | 19 ---
 .../TaskDeploymentDescriptorFactory.java   | 58 --
 .../deployment/CachedShuffleDescriptorsTest.java   | 30 ++-
 .../TaskDeploymentDescriptorTestUtils.java |  9 ++--
 .../partition/consumer/SingleInputGateTest.java|  6 ++-
 8 files changed, 83 insertions(+), 93 deletions(-)

diff --git 
a/flink-runtime/src/main/java/org/apache/flink/runtime/blob/BlobWriter.java 
b/flink-runtime/src/main/java/org/apache/flink/runtime/blob/BlobWriter.java
index 555cccfb7ca..2d5292b42cb 100644
--- a/flink-runtime/src/main/java/org/apache/flink/runtime/blob/BlobWriter.java
+++ b/flink-runtime/src/main/java/org/apache/flink/runtime/blob/BlobWriter.java
@@ -28,7 +28,6 @@ import org.slf4j.LoggerFactory;
 
 import java.io.IOException;
 import java.io.InputStream;
-import java.util.Optional;
 
 /** BlobWriter is used to upload data to the BLOB store. */
 public interface BlobWriter {
@@ -103,13 +102,11 @@ public interface BlobWriter {
 if (serializedValue.getByteArray().length < 
blobWriter.getMinOffloadingSize()) {
 return Either.Left(serializedValue);
 } else {
-return offloadWithException(serializedValue, jobId, blobWriter)
-.map(Either::<SerializedValue<T>, PermanentBlobKey>Right)
-.orElse(Either.Left(serializedValue));
+return offloadWithException(serializedValue, jobId, blobWriter);
 }
 }
 
-static <T> Optional<PermanentBlobKey> offloadWithException(
+static <T> Either<SerializedValue<T>, PermanentBlobKey> offloadWithException(
 SerializedValue<T> serializedValue, JobID jobId, BlobWriter blobWriter) {
 Preconditions.checkNotNull(serializedValue);
 Preconditions.checkNotNull(jobId);
@@ -117,10 +114,10 @@ public interface BlobWriter {
 try {
 final PermanentBlobKey permanentBlobKey =
 blobWriter.putPermanent(jobId, 
serializedValue.getByteArray());
-return Optional.of(permanentBlobKey);
+return Either.Right(permanentBlobKey);
 } catch (IOException e) {
 LOG.warn("Failed to offload value for job {} to BLOB store.", 
jobId, e);
-return Optional.empty();
+return Either.Left(serializedValue);
 }
 }
 }
diff --git 
a/flink-runtime/src/main/java/org/apache/flink/runtime/deployment/CachedShuffleDescriptors.java
 
b/flink-runtime/src/main/java/org/apache/flink/runtime/deployment/CachedShuffleDescriptors.java
index 4ddacbd671a..b8e0b44006f 100644
--- 
a/flink-runtime/src/main/java/org/apache/flink/runtime/deployment/CachedShuffleDescriptors.java
+++ 
b/flink-runtime/src/main/java/org/apache/flink/runtime/deployment/CachedShuffleDescriptors.java
@@ -87,7 +87,7 @@ public class CachedShuffleDescriptors {
 new ShuffleDescriptorGroup(
 toBeSerialized.toArray(new 
ShuffleDescriptorAndIndex[0]));
 MaybeOffloaded<ShuffleDescriptorGroup> serializedShuffleDescriptorGroup =
-
shuffleDescriptorSerializer.trySerializeAndOffloadShuffleDescriptor(
+
shuffleDescriptorSerializer.serializeAndTryOffloadShuffleDescriptor(
 shuffleDescriptorGroup, numConsumers);
 
 toBeSerialized.clear();
diff --git 
a/flink-runtime/src/main/java/org/apache/flink/runtime/deployment/InputGateDeploymentDescriptor.java
 
b/flink-runtime/src/main/java/org/apache/flink/runtime/deployment/InputGateDeploymentDescriptor.java
index 4e02c699331..333a91e0a73 100644
--- 
a/flink-runtime/src/main/java/org/apache/flink/runtime/deployment/InputGateDeploymentDescriptor.java
+++ 
b/flink-runtime/src/main/java/org/apache/flink/runtime/deployment/InputGateDeploymentDescriptor.java
@@ -23,7 +23,7 @@ import org.apache.flink.api.common.JobID;
 import org.apache.flink.runtime.blob.PermanentBlobKey;
 import org.apache.flink.runtime.blob.PermanentBlobService;
 import 
org.apache.flink.runtime.deployment.TaskDepl

Re: [PR] [Feature-2932][Pipeline] Flink CDC pipeline supports transform [flink-cdc]

2024-03-06 Thread via GitHub


leonardBang commented on code in PR #2937:
URL: https://github.com/apache/flink-cdc/pull/2937#discussion_r1514480625


##
flink-cdc-common/src/main/java/com/ververica/cdc/common/utils/ThreadLocalCache.java:
##
@@ -0,0 +1,86 @@
+/*
+ * Copyright 2023 Ververica Inc.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package com.ververica.cdc.common.utils;
+
+import org.apache.flink.annotation.Internal;
+
+import java.util.LinkedHashMap;
+import java.util.Map;
+import java.util.function.Function;
+
+/**
+ * Provides a thread local cache with a maximum cache size per thread.
+ *
+ * Note: Values must not be null.
+ */
+@Internal
+public abstract class ThreadLocalCache<K, V> {
+
+private static final int DEFAULT_CACHE_SIZE = 64;
+
+private final ThreadLocal<BoundedMap<K, V>> cache = new ThreadLocal<>();
+private final int maxSizePerThread;
+
+protected ThreadLocalCache() {
+this(DEFAULT_CACHE_SIZE);
+}
+
+protected ThreadLocalCache(int maxSizePerThread) {
+this.maxSizePerThread = maxSizePerThread;
+}
+
+public V get(K key) {
+BoundedMap<K, V> map = cache.get();
+if (map == null) {
+map = new BoundedMap<>(maxSizePerThread);
+cache.set(map);
+}
+V value = map.get(key);
+if (value == null) {
+value = getNewInstance(key);
+map.put(key, value);
+}
+return value;
+}
+
+public abstract V getNewInstance(K key);
+
+private static class BoundedMap<K, V> extends LinkedHashMap<K, V> {
+
+private static final long serialVersionUID = -211630219014422361L;

Review Comment:
   1L
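
   For readers following along, here is a minimal, self-contained sketch of how 
a cache like the `ThreadLocalCache` above is used (generic parameters restored; 
the `removeEldestEntry` override is an assumption about how `BoundedMap` 
enforces its size limit, since the diff is truncated before its body):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ThreadLocalCacheDemo {

    // Simplified re-declaration for illustration; the real class lives in
    // com.ververica.cdc.common.utils and may differ in detail.
    abstract static class ThreadLocalCache<K, V> {
        private final ThreadLocal<BoundedMap<K, V>> cache = new ThreadLocal<>();
        private final int maxSizePerThread;

        protected ThreadLocalCache(int maxSizePerThread) {
            this.maxSizePerThread = maxSizePerThread;
        }

        public V get(K key) {
            BoundedMap<K, V> map = cache.get();
            if (map == null) {
                map = new BoundedMap<>(maxSizePerThread);
                cache.set(map);
            }
            V value = map.get(key);
            if (value == null) {
                value = getNewInstance(key); // cache miss: build and remember
                map.put(key, value);
            }
            return value;
        }

        public abstract V getNewInstance(K key);
    }

    static class BoundedMap<K, V> extends LinkedHashMap<K, V> {
        private final int maxSize;

        BoundedMap(int maxSize) {
            this.maxSize = maxSize;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > maxSize; // assumed eviction: drop the oldest entry
        }
    }

    public static void main(String[] args) {
        // A per-thread cache that "builds" an upper-cased value per key.
        ThreadLocalCache<String, String> cache =
                new ThreadLocalCache<String, String>(2) {
                    @Override
                    public String getNewInstance(String key) {
                        return key.toUpperCase();
                    }
                };
        System.out.println(cache.get("flink")); // FLINK
        System.out.println(cache.get("cdc"));   // CDC
    }
}
```

   Because the map lives in a ThreadLocal, no synchronization is needed; each 
task thread pays only for its own entries.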



##
flink-cdc-common/src/main/java/com/ververica/cdc/common/utils/DateTimeUtils.java:
##
@@ -0,0 +1,120 @@
+/*
+ * Copyright 2023 Ververica Inc.

Review Comment:
   please rebase to fix the copyright.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[PR] [hotfix] Change old com.ververica dependency to flink [flink-cdc]

2024-03-06 Thread via GitHub


xleoken opened a new pull request, #3110:
URL: https://github.com/apache/flink-cdc/pull/3110

   Change old com.ververica dependency to flink.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [I] [Bug] Dependency conflict when integrating CDC 3.0.1 with Flink 1.16.2 [flink-cdc]

2024-03-06 Thread via GitHub


WuChongYong commented on issue #3071:
URL: https://github.com/apache/flink-cdc/issues/3071#issuecomment-1980762450

   you can use the flink 1.18.0 version
   
   
   
     Original message 
   | From | Hongshun ***@***.***> |
   | Date | 2024-03-06 20:25 |
   | To | ***@***.***> |
   | Cc | ***@***.***>***@***.***> |
   | Subject | Re: [apache/flink-cdc] [Bug] Dependency conflict when integrating CDC 3.0.1 with Flink 1.16.2 (Issue 
#3071) |
   
   @WuChongYong , @YyItRoad , hi, Please give me your code so that I can 
reproduce it.
   
   —
   Reply to this email directly, view it on GitHub, or unsubscribe.
   You are receiving this because you were mentioned.Message ID: ***@***.***>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [I] [Bug] Dependency conflict when integrating CDC 3.0.1 with Flink 1.16.2 [flink-cdc]

2024-03-06 Thread via GitHub


loserwang1024 commented on issue #3071:
URL: https://github.com/apache/flink-cdc/issues/3071#issuecomment-1980758546

   @WuChongYong , @YyItRoad , hi, Please give me your code so that I can 
reproduce it.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



(flink) branch master updated (76cce1e9d35 -> 9b1375520b6)

2024-03-06 Thread jingge
This is an automated email from the ASF dual-hosted git repository.

jingge pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


from 76cce1e9d35 [FLINK-34401][docs-zh] Translate "Flame Graphs" page into 
Chinese (#24279)
 add 9b1375520b6 Modify obvious errors in the doc.

No new revisions were added by this update.

Summary of changes:
 docs/content.zh/docs/try-flink/table_api.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)



(flink) branch master updated: [FLINK-34401][docs-zh] Translate "Flame Graphs" page into Chinese (#24279)

2024-03-06 Thread jingge
This is an automated email from the ASF dual-hosted git repository.

jingge pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 76cce1e9d35 [FLINK-34401][docs-zh] Translate "Flame Graphs" page into 
Chinese (#24279)
76cce1e9d35 is described below

commit 76cce1e9d351eda4e76096707e1bc4302b200922
Author: lxliyou001 <47881938+lxliyou...@users.noreply.github.com>
AuthorDate: Wed Mar 6 18:58:20 2024 +0800

[FLINK-34401][docs-zh] Translate "Flame Graphs" page into Chinese (#24279)

[FLINK-34401][docs-zh] Translate "Flame Graphs" page into Chinese

Co-authored-by: Zakelly 
---
 docs/content.zh/docs/ops/debugging/flame_graphs.md | 44 +++---
 .../flink/runtime/blob/PermanentBlobCache.java |  2 +-
 2 files changed, 23 insertions(+), 23 deletions(-)

diff --git a/docs/content.zh/docs/ops/debugging/flame_graphs.md 
b/docs/content.zh/docs/ops/debugging/flame_graphs.md
index 6a030dff12e..90a90c2fe10 100644
--- a/docs/content.zh/docs/ops/debugging/flame_graphs.md
+++ b/docs/content.zh/docs/ops/debugging/flame_graphs.md
@@ -25,38 +25,39 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-# Flame Graphs
+# 火焰图
 
-[Flame Graphs](http://www.brendangregg.com/flamegraphs.html) are a 
visualization that effectively surfaces answers to questions like:
-- Which methods are currently consuming CPU resources?
-- How does consumption by one method compare to the others?
-- Which series of calls on the stack led to executing a particular method?
+[Flame Graphs](http://www.brendangregg.com/flamegraphs.html) 
是一种有效的可视化工具,可以回答以下问题:
+
+- 目前哪些方法正在消耗 CPU 资源?
+- 一个方法的消耗与其他方法相比如何?
+- 哪一系列的堆栈调用导致了特定方法的执行?
 
 {{< img src="/fig/flame_graph_on_cpu.png" class="img-fluid" width="90%" >}}
 {{% center %}}
 Flame Graph
 {{% /center %}}
 
-Flame Graphs are constructed by sampling stack traces a number of times. Each 
method call is presented by a bar, where the length of the bar is proportional 
to the number of times it is present in the samples.
+火焰图是通过多次采样堆栈跟踪来构建的。每个方法调用都由一个条形图表示,其中条形图的长度与其在样本中出现的次数成比例。
 
-Starting with Flink 1.13, Flame Graphs are natively supported in Flink. In 
order to produce a Flame Graph, navigate to the job graph of a running job, 
select an operator of interest and in the menu to the right click on the Flame 
Graph tab:  
+从 Flink 1.13 版本开始支持火焰图。要生成一个火焰图,请导航到正在运行的作业图,选择感兴趣的算子,并在右侧菜单中点击 "Flame Graph" 
选项卡: 
 
 {{< img src="/fig/flame_graph_operator.png" class="img-fluid" width="90%" >}}
 {{% center %}}
-Operator's On-CPU Flame Graph
+算子级别的 On-CPU 火焰图
 {{% /center %}}
 
 {{< hint warning >}}
 
-Any measurement process in and of itself inevitably affects the subject of 
measurement (see the [double-split 
experiment](https://en.wikipedia.org/wiki/Double-slit_experiment#Relational_interpretation)).
 Sampling CPU stack traces is no exception. In order to prevent unintended 
impacts on production environments, Flame Graphs are currently available as an 
opt-in feature. To enable it, you'll need to set [`rest.flamegraph.enabled: 
true`]({{< ref "docs/deployment/config">}}#rest-flamegraph- [...]
+任何测量过程本身不可避免地会影响被测对象(参考 [double-split 
experiment](https://en.wikipedia.org/wiki/Double-slit_experiment#Relational_interpretation))。对CPU堆栈跟踪进行采样也不例外。为了防止对生产环境产生意外影响,火焰图目前作为一项选择性功能可用。要启用它,你需要设置
 [`rest.flamegraph.enabled: true`]({{< ref 
"docs/deployment/config">}}#rest-flamegraph-enabled) in [Flink configuration 
file]({{< ref "docs/deployment/config#flink-配置文件" 
>}})。我们建议在开发和预生产环境中启用它,但在生产环境中请将其视为实验性功能。
 
 {{< /hint >}}
 
-Apart from the On-CPU Flame Graphs, 
[Off-CPU](http://www.brendangregg.com/FlameGraphs/offcpuflamegraphs.html) and 
Mixed visualizations are available and can be switched between by using the 
selector at the top of the pane:
+除了 On-CPU 火焰图之外, 
[Off-CPU](http://www.brendangregg.com/FlameGraphs/offcpuflamegraphs.html) 
还有混合可视化模式可供选择,并可以通过面板顶部的选择器进行切换:
 
 {{< img src="/fig/flame_graph_selector.png" class="img-fluid" width="30%" >}}
 
-The Off-CPU Flame Graph visualizes blocking calls found in the samples. A 
distinction is made as follows:
+Off-CPU 火焰图可视化了在样本中找到的阻塞调用。按如下方式进行区分:
 - On-CPU: `Thread.State` in **[RUNNABLE, NEW]**
 - Off-CPU: `Thread.State` in **[TIMED_WAITING, WAITING, BLOCKED]**
 
@@ -65,26 +66,25 @@ The Off-CPU Flame Graph visualizes blocking calls found in 
the samples. A distin
 Off-CPU Flame Graph
 {{% /center %}}
 
-Mixed mode Flame Graphs are constructed from stack traces of threads in all 
possible states.
+混合模式的火焰图是由处于所有可能状态的线程的堆栈跟踪构建而成。
 
 {{< img src="/fig/flame_graph_mixed.png" class="img-fluid" width="90%" >}}
 {{% center %}}
-Flame Graph in Mixed Mode
+混合模式的火焰图
 {{% /center %}}
 
-##  Sampling process
+##  采样过程
 
-The collection of stack traces is done purely within the JVM, so only method 
calls within the Java runtime are visible (no system calls).
+堆栈跟踪的收集纯粹在 JVM 内部进行,因此只能看到 Java 运行时内的方法调用(看不到系统调用)

(flink) branch master updated: [FLINK-34493][table] Migrate ReplaceMinusWithAntiJoinRule to java.

2024-03-06 Thread dwysakowicz
This is an automated email from the ASF dual-hosted git repository.

dwysakowicz pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 6a12668bcfe [FLINK-34493][table] Migrate ReplaceMinusWithAntiJoinRule 
to java.
6a12668bcfe is described below

commit 6a12668bcfe651fa938517eb2da4d537ce6ce668
Author: liuyongvs 
AuthorDate: Fri Mar 1 16:08:52 2024 +0800

[FLINK-34493][table] Migrate ReplaceMinusWithAntiJoinRule to java.
---
 .../logical/ReplaceMinusWithAntiJoinRule.java  | 95 ++
 .../logical/ReplaceMinusWithAntiJoinRule.scala | 65 ---
 2 files changed, 95 insertions(+), 65 deletions(-)

diff --git 
a/flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/rules/logical/ReplaceMinusWithAntiJoinRule.java
 
b/flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/rules/logical/ReplaceMinusWithAntiJoinRule.java
new file mode 100644
index 000..35c719e3846
--- /dev/null
+++ 
b/flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/rules/logical/ReplaceMinusWithAntiJoinRule.java
@@ -0,0 +1,95 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.rules.logical;
+
+import org.apache.calcite.plan.RelOptRuleCall;
+import org.apache.calcite.plan.RelRule;
+import org.apache.calcite.rel.RelNode;
+import org.apache.calcite.rel.core.JoinRelType;
+import org.apache.calcite.rel.core.Minus;
+import org.apache.calcite.rel.core.RelFactories;
+import org.apache.calcite.rex.RexNode;
+import org.apache.calcite.tools.RelBuilder;
+import org.apache.calcite.util.Util;
+import org.immutables.value.Value;
+
+import java.util.List;
+
+import static 
org.apache.flink.table.planner.plan.utils.SetOpRewriteUtil.generateEqualsCondition;
+
+/**
+ * Planner rule that replaces distinct {@link 
org.apache.calcite.rel.core.Minus} (SQL keyword:
+ * EXCEPT) with a distinct {@link org.apache.calcite.rel.core.Aggregate} on an 
ANTI {@link
+ * org.apache.calcite.rel.core.Join}.
+ *
+ * Only handle the case of input size 2.
+ */
+@Value.Enclosing
+public class ReplaceMinusWithAntiJoinRule
+extends RelRule<ReplaceMinusWithAntiJoinRule.ReplaceMinusWithAntiJoinRuleConfig> {
+
+public static final ReplaceMinusWithAntiJoinRule INSTANCE =
+
ReplaceMinusWithAntiJoinRule.ReplaceMinusWithAntiJoinRuleConfig.DEFAULT.toRule();
+
+private ReplaceMinusWithAntiJoinRule(ReplaceMinusWithAntiJoinRuleConfig 
config) {
+super(config);
+}
+
+@Override
+public boolean matches(RelOptRuleCall call) {
+Minus minus = call.rel(0);
+return !minus.all && minus.getInputs().size() == 2;
+}
+
+@Override
+public void onMatch(RelOptRuleCall call) {
+Minus minus = call.rel(0);
+RelNode left = minus.getInput(0);
+RelNode right = minus.getInput(1);
+
+RelBuilder relBuilder = call.builder();
+List<Integer> keys = Util.range(left.getRowType().getFieldCount());
+List<RexNode> conditions = generateEqualsCondition(relBuilder, left, right, keys);
+
+relBuilder.push(left);
+relBuilder.push(right);
+relBuilder
+.join(JoinRelType.ANTI, conditions)
+.aggregate(
+
relBuilder.groupKey(keys.stream().mapToInt(Integer::intValue).toArray()));
+RelNode rel = relBuilder.build();
+call.transformTo(rel);
+}
+
+/** Rule configuration. */
+@Value.Immutable(singleton = false)
+public interface ReplaceMinusWithAntiJoinRuleConfig extends RelRule.Config 
{
+ReplaceMinusWithAntiJoinRule.ReplaceMinusWithAntiJoinRuleConfig 
DEFAULT =
+
ImmutableReplaceMinusWithAntiJoinRule.ReplaceMinusWithAntiJoinRuleConfig.builder()
+.build()
+.withOperandSupplier(b0 -> 
b0.operand(Minus.class).anyInputs())
+.withRelBuilderFactory(RelFactories.LOGICAL_BUILDER)
+.withDescription("ReplaceMinusWithAntiJoinRule");
+
+@Override
+default ReplaceMinusWithAntiJoinRule toRule() {
+return new R

Re: [I] [Bug] flink-cdc-connector-oracle full-snapshot table sync is always missing one row [flink-cdc]

2024-03-06 Thread via GitHub


lfift commented on issue #2567:
URL: https://github.com/apache/flink-cdc/issues/2567#issuecomment-1980450772

   Is there a primary key in the table?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [I] The Document URL in README.md file was invalid [flink-cdc]

2024-03-06 Thread via GitHub


hnbian commented on issue #3105:
URL: https://github.com/apache/flink-cdc/issues/3105#issuecomment-1980415779

   https://github.com/apache/flink-cdc/pull/3106


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



(flink) branch master updated: Revert "[FLINK-33532][network] Move the serialization of ShuffleDescriptorGroup out of the RPC main thread]"

2024-03-06 Thread guoyangze
This is an automated email from the ASF dual-hosted git repository.

guoyangze pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
 new 7a709bf323c Revert "[FLINK-33532][network] Move the serialization of 
ShuffleDescriptorGroup out of the RPC main thread]"
7a709bf323c is described below

commit 7a709bf323c1cce3440887fe937311bae119aab0
Author: caodizhou 
AuthorDate: Wed Mar 6 14:11:56 2024 +0800

Revert "[FLINK-33532][network] Move the serialization of 
ShuffleDescriptorGroup out of the RPC main thread]"

This reverts commit d18a4bfe596fc580f8280750fa3bfa22007671d9.
---
 .../org/apache/flink/runtime/blob/BlobWriter.java  | 11 ++--
 .../deployment/CachedShuffleDescriptors.java   |  2 +-
 .../deployment/InputGateDeploymentDescriptor.java  | 41 ++-
 .../deployment/TaskDeploymentDescriptor.java   | 19 ---
 .../TaskDeploymentDescriptorFactory.java   | 58 --
 .../deployment/CachedShuffleDescriptorsTest.java   | 30 ++-
 .../TaskDeploymentDescriptorTestUtils.java |  9 ++--
 .../partition/consumer/SingleInputGateTest.java|  6 ++-
 8 files changed, 83 insertions(+), 93 deletions(-)

diff --git 
a/flink-runtime/src/main/java/org/apache/flink/runtime/blob/BlobWriter.java 
b/flink-runtime/src/main/java/org/apache/flink/runtime/blob/BlobWriter.java
index 555cccfb7ca..2d5292b42cb 100644
--- a/flink-runtime/src/main/java/org/apache/flink/runtime/blob/BlobWriter.java
+++ b/flink-runtime/src/main/java/org/apache/flink/runtime/blob/BlobWriter.java
@@ -28,7 +28,6 @@ import org.slf4j.LoggerFactory;
 
 import java.io.IOException;
 import java.io.InputStream;
-import java.util.Optional;
 
 /** BlobWriter is used to upload data to the BLOB store. */
 public interface BlobWriter {
@@ -103,13 +102,11 @@ public interface BlobWriter {
 if (serializedValue.getByteArray().length < 
blobWriter.getMinOffloadingSize()) {
 return Either.Left(serializedValue);
 } else {
-return offloadWithException(serializedValue, jobId, blobWriter)
-.map(Either::<SerializedValue<T>, PermanentBlobKey>Right)
-.orElse(Either.Left(serializedValue));
+return offloadWithException(serializedValue, jobId, blobWriter);
 }
 }
 
-static <T> Optional<PermanentBlobKey> offloadWithException(
+static <T> Either<SerializedValue<T>, PermanentBlobKey> offloadWithException(
 SerializedValue<T> serializedValue, JobID jobId, BlobWriter blobWriter) {
 Preconditions.checkNotNull(serializedValue);
 Preconditions.checkNotNull(jobId);
@@ -117,10 +114,10 @@ public interface BlobWriter {
 try {
 final PermanentBlobKey permanentBlobKey =
 blobWriter.putPermanent(jobId, 
serializedValue.getByteArray());
-return Optional.of(permanentBlobKey);
+return Either.Right(permanentBlobKey);
 } catch (IOException e) {
 LOG.warn("Failed to offload value for job {} to BLOB store.", 
jobId, e);
-return Optional.empty();
+return Either.Left(serializedValue);
 }
 }
 }
diff --git 
a/flink-runtime/src/main/java/org/apache/flink/runtime/deployment/CachedShuffleDescriptors.java
 
b/flink-runtime/src/main/java/org/apache/flink/runtime/deployment/CachedShuffleDescriptors.java
index 4ddacbd671a..b8e0b44006f 100644
--- 
a/flink-runtime/src/main/java/org/apache/flink/runtime/deployment/CachedShuffleDescriptors.java
+++ 
b/flink-runtime/src/main/java/org/apache/flink/runtime/deployment/CachedShuffleDescriptors.java
@@ -87,7 +87,7 @@ public class CachedShuffleDescriptors {
 new ShuffleDescriptorGroup(
 toBeSerialized.toArray(new 
ShuffleDescriptorAndIndex[0]));
 MaybeOffloaded<ShuffleDescriptorGroup> serializedShuffleDescriptorGroup =
-
shuffleDescriptorSerializer.trySerializeAndOffloadShuffleDescriptor(
+
shuffleDescriptorSerializer.serializeAndTryOffloadShuffleDescriptor(
 shuffleDescriptorGroup, numConsumers);
 
 toBeSerialized.clear();
diff --git 
a/flink-runtime/src/main/java/org/apache/flink/runtime/deployment/InputGateDeploymentDescriptor.java
 
b/flink-runtime/src/main/java/org/apache/flink/runtime/deployment/InputGateDeploymentDescriptor.java
index 4e02c699331..333a91e0a73 100644
--- 
a/flink-runtime/src/main/java/org/apache/flink/runtime/deployment/InputGateDeploymentDescriptor.java
+++ 
b/flink-runtime/src/main/java/org/apache/flink/runtime/deployment/InputGateDeploymentDescriptor.java
@@ -23,7 +23,7 @@ import org.apache.flink.api.common.JobID;
 import org.apache.flink.runtime.blob.PermanentBlobKey;
 import org.apache.flink.runtime.blob.PermanentBlobService;
 import 
org.apache.flink.runtime.deployment.TaskDeploymentDescriptor.MaybeOffloaded;
-import 
org.apache.flink.runtime.deployment.TaskDepl

Re: [PR] [fix] repair a bug: Sometimes we will get a big chunk when use splitOneUnevenlySizedChunk [flink-cdc]

2024-03-06 Thread via GitHub


AidenPerce commented on PR #2915:
URL: https://github.com/apache/flink-cdc/pull/2915#issuecomment-1980384760

   > > > Please resolve the conflicts and add some ut cases for it.
   > > 
   > > 
   > > qwq! Sorry, I will close this PR; there are some problems. When a MySQL 
server uses "utf8mb4_general_ci" but a table/database uses "utf8mb4_bin", and 
the data in table `test` looks like: | field1 | |  | |  | |  |
   > > `SELECT MAX(field1) FROM test` returns "", but `SELECT 
STRCMP('', '')` returns -1, because of the collation and case sensitivity; 
I will find other ways to deal with this problem.
   > 
   > please how do you fix it? i have same question.
   
   I have added a method that compares the values through SQL to fix this 
problem; the PR is #2968.
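
   The collation mismatch discussed above can be illustrated outside MySQL. The 
following is a hypothetical sketch (not the actual fix in the PR) showing how a 
maximum computed under a case-insensitive collation, like utf8mb4_general_ci, 
can disagree with a binary comparison, like utf8mb4_bin's STRCMP:

```java
import java.util.Comparator;
import java.util.List;

public class CollationMismatchDemo {
    public static void main(String[] args) {
        // Hypothetical values; 'a' (0x61) sorts after 'B' (0x42) byte-wise,
        // but before 'B' once case is folded away.
        List<String> values = List.of("a", "B");

        // MAX(...) under a case-insensitive collation ignores case:
        String ciMax = values.stream().max(String.CASE_INSENSITIVE_ORDER).get();

        // A binary collation compares raw code points, like STRCMP on utf8mb4_bin:
        String binMax = values.stream().max(Comparator.<String>naturalOrder()).get();

        System.out.println(ciMax);  // B
        System.out.println(binMax); // a
    }
}
```

   A chunk splitter that fetches boundaries with MAX() under one collation but 
compares rows under another can therefore mis-size a chunk, which is the 
symptom this PR set out to address.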
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[I] Error on Postgres-CDC using incremental snapshot with UUID column as PK [flink-cdc]

2024-03-06 Thread via GitHub


olivier-derom opened a new issue, #3108:
URL: https://github.com/apache/flink-cdc/issues/3108

   A majority of our Postgres databases use UUIDs as primary keys.
   When we enable 'scan.incremental.snapshot.enabled = true', Flink-CDC will 
try to split into chunks.
   The splitTableIntoChunks function relies on the queryMinMax function, which 
fails when trying to calculate the MIN(UUID), as that is not supported in 
Postgres.
   
   Is there a way around this?
   
   When we convert our column to VARCHAR, rather than UUID, everything seems to 
work.
   We did not find a way to cast our UUIDs to VARCHAR while splitting them into 
chunks without editing the source code or altering the source table.
   
   Disabling incremental snapshots also fixes the issue, as we do not split 
into chunks anymore, but this would mean we get a global read lock on the data 
before snapshot reading, which we want to avoid.
   
   Thanks in advance for the help!
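
   For illustration, a sketch of what the VARCHAR workaround amounts to, with 
hypothetical values: Postgres has no min(uuid)/max(uuid) aggregate, but text has 
a total order, so comparing the textual form of the UUIDs (as a VARCHAR column, 
or a cast, would) yields usable chunk boundaries. This emulates that ordering in 
Java rather than touching Postgres:

```java
import java.util.Comparator;
import java.util.List;
import java.util.UUID;

public class UuidChunkBoundsDemo {
    public static void main(String[] args) {
        // Hypothetical key values from the source table.
        List<UUID> ids = List.of(
                UUID.fromString("00000000-0000-0000-0000-0000000000ff"),
                UUID.fromString("7f000000-0000-0000-0000-000000000001"),
                UUID.fromString("ffffffff-0000-0000-0000-000000000000"));

        // Compare the text form deliberately: java.util.UUID's own compareTo
        // uses signed 64-bit halves and would order these differently.
        Comparator<UUID> textOrder = Comparator.comparing(UUID::toString);

        String min = ids.stream().min(textOrder).get().toString();
        String max = ids.stream().max(textOrder).get().toString();

        System.out.println(min); // 00000000-0000-0000-0000-0000000000ff
        System.out.println(max); // ffffffff-0000-0000-0000-000000000000
    }
}
```

   Any total order over the key column works for chunking, as long as the 
boundary queries and the per-chunk range predicates use the same order.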


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



Re: [PR] [FLINK-34180] Migrate doc website from ververica to flink [flink-cdc]

2024-03-06 Thread via GitHub


GOODBOY008 commented on code in PR #3028:
URL: https://github.com/apache/flink-cdc/pull/3028#discussion_r1514025693


##
.dlc.json:
##
@@ -0,0 +1,32 @@
+{

Review Comment:
   I would like this file to be used for checking dead links in the docs, but it is currently not active.
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[PR] [FLINK-34183] Update NOTICE files [flink-cdc]

2024-03-06 Thread via GitHub


ruanhang1993 opened a new pull request, #3107:
URL: https://github.com/apache/flink-cdc/pull/3107

   This PR updates NOTICE files in Flink CDC.





[PR] Solve the problem of invalid document URL address in README.md file [flink-cdc]

2024-03-06 Thread via GitHub


hnbian opened a new pull request, #3106:
URL: https://github.com/apache/flink-cdc/pull/3106

   https://github.com/apache/flink-cdc/issues/3105
   
   Fixes the invalid documentation URLs in the README.md file by updating the 
links so that they point to the current documentation address.





[I] The Document URL in README.md file was invalid [flink-cdc]

2024-03-06 Thread via GitHub


hnbian opened a new issue, #3105:
URL: https://github.com/apache/flink-cdc/issues/3105

   
   https://github.com/apache/flink-cdc/assets/12964828/fde93c8c-76c0-4fa0-9f31-7d3fd9977c2b
   https://github.com/apache/flink-cdc/assets/12964828/e56f3719-1877-4939-85f3-c5c4f489af43
   https://github.com/apache/flink-cdc/assets/12964828/3811f690-a815-4004-8824-b7cce3a75bf2
   





Re: [PR] [FLINK-34180] Migrate doc website from ververica to flink [flink-cdc]

2024-03-06 Thread via GitHub


PatrickRen commented on code in PR #3028:
URL: https://github.com/apache/flink-cdc/pull/3028#discussion_r1513993227


##
.dlc.json:
##
@@ -0,0 +1,32 @@
+{

Review Comment:
   What's the purpose of this file?






(flink-cdc) branch gh-pages updated (f84c2b4b5 -> 18544edd2)

2024-03-06 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a change to branch gh-pages
in repository https://gitbox.apache.org/repos/asf/flink-cdc.git


 discard f84c2b4b5 Generated docs from commit a6c1b06e11004918cb6e5714ade3699e052e1aad
 new 18544edd2 Generated docs from commit e10c8691add2cfedfe859a5e51115a08212d7cf0

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (f84c2b4b5)
\
 N -- N -- N   refs/heads/gh-pages (18544edd2)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../.doctrees/content/connectors/db2-cdc.doctree   | Bin 39188 -> 39194 bytes
 .../content/connectors/mongodb-cdc(ZH).doctree | Bin 84508 -> 84532 bytes
 .../content/connectors/mongodb-cdc.doctree | Bin 88716 -> 88740 bytes
 .../content/connectors/mysql-cdc(ZH).doctree   | Bin 142726 -> 142738 bytes
 .../.doctrees/content/connectors/mysql-cdc.doctree | Bin 149567 -> 149579 bytes
 .../content/connectors/oceanbase-cdc(ZH).doctree   | Bin 78785 -> 78809 bytes
 .../content/connectors/oceanbase-cdc.doctree   | Bin 79888 -> 79912 bytes
 .../content/connectors/oracle-cdc.doctree  | Bin 76362 -> 76398 bytes
 .../content/connectors/postgres-cdc.doctree| Bin 65800 -> 65836 bytes
 .../content/connectors/sqlserver-cdc.doctree   | Bin 56695 -> 56731 bytes
 .../.doctrees/content/connectors/tidb-cdc.doctree  | Bin 39822 -> 39846 bytes
 .../content/connectors/vitess-cdc.doctree  | Bin 35771 -> 35783 bytes
 .../content/overview/cdc-connectors.doctree| Bin 102465 -> 102477 bytes
 .../datastream-api-package-guidance.doctree| Bin 28646 -> 28694 bytes
 .../datastream-api-package-guidance-zh.doctree"| Bin 28653 -> 28701 bytes
 master/.doctrees/environment.pickle| Bin 238146 -> 238146 bytes
 master/_sources/content/connectors/db2-cdc.md.txt  |   2 +-
 .../content/connectors/mongodb-cdc(ZH).md.txt  |   8 
 .../_sources/content/connectors/mongodb-cdc.md.txt |   8 
 .../content/connectors/mysql-cdc(ZH).md.txt|   4 ++--
 .../_sources/content/connectors/mysql-cdc.md.txt   |   4 ++--
 .../content/connectors/oceanbase-cdc(ZH).md.txt|   8 
 .../content/connectors/oceanbase-cdc.md.txt|   8 
 .../_sources/content/connectors/oracle-cdc.md.txt  |  12 ++--
 .../content/connectors/postgres-cdc.md.txt |  12 ++--
 .../content/connectors/sqlserver-cdc.md.txt|  12 ++--
 master/_sources/content/connectors/tidb-cdc.md.txt |   8 
 .../_sources/content/connectors/vitess-cdc.md.txt  |   4 ++--
 .../content/overview/cdc-connectors.md.txt |   4 ++--
 .../datastream-api-package-guidance.md.txt |  16 
 .../datastream-api-package-guidance-zh.md.txt" |  16 
 master/content/connectors/db2-cdc.html |   2 +-
 master/content/connectors/mongodb-cdc(ZH).html |   8 
 master/content/connectors/mongodb-cdc.html |   8 
 master/content/connectors/mysql-cdc(ZH).html   |   4 ++--
 master/content/connectors/mysql-cdc.html   |   4 ++--
 master/content/connectors/oceanbase-cdc(ZH).html   |   8 
 master/content/connectors/oceanbase-cdc.html   |   8 
 master/content/connectors/oracle-cdc.html  |  12 ++--
 master/content/connectors/postgres-cdc.html|  12 ++--
 master/content/connectors/sqlserver-cdc.html   |  12 ++--
 master/content/connectors/tidb-cdc.html|   8 
 master/content/connectors/vitess-cdc.html  |   4 ++--
 master/content/overview/cdc-connectors.html|   4 ++--
 .../datastream-api-package-guidance.html   |  16 
 .../datastream-api-package-guidance-zh.html"   |  16 
 release-1.4/.doctrees/environment.pickle   | Bin 44175 -> 44175 bytes
 release-2.0/.doctrees/environment.pickle   | Bin 48679 -> 48679 bytes
 release-2.1/.doctrees/environment.pickle   | Bin 92968 -> 92968 bytes
 release-2.2/.doctrees/environment.pickle   | Bin 152545 -> 152545 bytes
 release-2.3/.doctrees/environment.pickle   | Bin 160641 -> 160641 bytes

Re: [PR] [FLINK-34584] rename package com.ververica.cdc to org.apache.flink.cdc [flink-cdc]

2024-03-06 Thread via GitHub


leonardBang merged PR #3089:
URL: https://github.com/apache/flink-cdc/pull/3089

