This is an automated email from the ASF dual-hosted git repository.
yuxia pushed a commit to branch release-0.8
in repository https://gitbox.apache.org/repos/asf/fluss.git
The following commit(s) were added to refs/heads/release-0.8 by this push:
new c6d7d8c08 [hotfix] Fix iceberg quickstart sql (#1964) (#1968)
c6d7d8c08 is described below
commit c6d7d8c0857ddcd643cca5d234ae6490613a3181
Author: xx789 <[email protected]>
AuthorDate: Wed Nov 12 10:28:18 2025 +0800
[hotfix] Fix iceberg quickstart sql (#1964) (#1968)
---
website/docs/quickstart/lakehouse.md | 44 ++++++++++++++++++++++++++++++++++++
1 file changed, 44 insertions(+)
diff --git a/website/docs/quickstart/lakehouse.md b/website/docs/quickstart/lakehouse.md
index b63e504d6..06719c0df 100644
--- a/website/docs/quickstart/lakehouse.md
+++ b/website/docs/quickstart/lakehouse.md
@@ -332,6 +332,10 @@ For further information how to store catalog configurations, see [Flink's Catalo
:::
### Create Tables
+<Tabs groupId="lake-tabs">
+ <TabItem value="paimon" label="Paimon" default>
+
+
Run the following SQL to create the Fluss tables used in this guide:
```sql title="Flink SQL"
CREATE TABLE fluss_order (
@@ -366,6 +370,46 @@ CREATE TABLE fluss_nation (
);
```
+ </TabItem>
+
+ <TabItem value="iceberg" label="Iceberg">
+
+
+Run the following SQL to create the Fluss tables used in this guide:
+```sql title="Flink SQL"
+CREATE TABLE fluss_order (
+ `order_key` BIGINT,
+ `cust_key` INT NOT NULL,
+ `total_price` DECIMAL(15, 2),
+ `order_date` DATE,
+ `order_priority` STRING,
+ `clerk` STRING,
+ `ptime` AS PROCTIME()
+);
+```
+
+```sql title="Flink SQL"
+CREATE TABLE fluss_customer (
+ `cust_key` INT NOT NULL,
+ `name` STRING,
+ `phone` STRING,
+ `nation_key` INT NOT NULL,
+ `acctbal` DECIMAL(15, 2),
+ `mktsegment` STRING,
+ PRIMARY KEY (`cust_key`) NOT ENFORCED
+);
+```
+
+```sql title="Flink SQL"
+CREATE TABLE fluss_nation (
+ `nation_key` INT NOT NULL,
+ `name` STRING,
+ PRIMARY KEY (`nation_key`) NOT ENFORCED
+);
+```
+
+ </TabItem>
+</Tabs>
## Streaming into Fluss
First, run the following SQL to sync data from source tables to Fluss tables:
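The sync SQL itself is outside this hunk, but the step typically pairs each source table with an `INSERT INTO` job targeting the Fluss table defined above. A minimal sketch, assuming a `source_customer` table exists in the quickstart (the name is hypothetical, not confirmed by this diff):

```sql title="Flink SQL"
-- Sketch only: source_customer is an assumed source table name.
-- Columns match the fluss_customer schema created earlier in this guide.
INSERT INTO fluss_customer
SELECT `cust_key`, `name`, `phone`, `nation_key`, `acctbal`, `mktsegment`
FROM source_customer;
```

In streaming mode this statement submits a continuously running Flink job that keeps `fluss_customer` in sync with its source.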