This is an automated email from the ASF dual-hosted git repository.

lzljs3620320 pushed a change to branch release-1.1
in repository https://gitbox.apache.org/repos/asf/paimon.git


    from 403c6b705c [parquet] Fix timestamp type and decimal type, if the file schema is not correctly match the schema in metadata (#5582)
     new e72f3d7fcf [core] Postpone bucket should introduce a new BucketMode (#5592)
     new 1bb2f6027f [core] Fix dynamic insert into table with partition columns contain primary key error (#5588)

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails. The revisions
listed as "add" were already present in the repository and have only
been added to this reference.


Summary of changes:
 .../java/org/apache/paimon/table/BucketMode.java   | 10 +++++++++-
 .../java/org/apache/paimon/KeyValueFileStore.java  | 18 +++++++++++------
 .../paimon/crosspartition/GlobalIndexAssigner.java |  6 +++---
 .../paimon/crosspartition/IndexBootstrap.java      | 13 ++++++------
 .../KeyPartPartitionKeyExtractor.java              |  5 ++---
 .../paimon/table/AbstractFileStoreTable.java       |  4 ++++
 .../paimon/table/PrimaryKeyFileStoreTable.java     | 11 -----------
 ...ava => RowPartitionAllPrimaryKeyExtractor.java} | 17 +++++++++-------
 .../paimon/crosspartition/IndexBootstrapTest.java  |  2 +-
 .../flink/lookup/PrimaryKeyPartialLookupTable.java |  5 +++--
 .../paimon/flink/sink/CompactorSinkBuilder.java    |  1 -
 .../apache/paimon/flink/sink/FlinkSinkBuilder.java |  8 +++-----
 .../paimon/flink/FlinkJobRecoveryITCase.java       |  3 +++
 .../scala/org/apache/paimon/spark/SparkTable.scala |  3 +++
 .../paimon/spark/commands/PaimonSparkWriter.scala  |  3 +--
 .../spark/sql/InsertOverwriteTableTestBase.scala   | 23 ++++++++++++++++++++++
 16 files changed, 84 insertions(+), 48 deletions(-)
 copy paimon-core/src/main/java/org/apache/paimon/table/sink/{RowPartitionKeyExtractor.java => RowPartitionAllPrimaryKeyExtractor.java} (78%)
