This is an automated email from the ASF dual-hosted git repository.

lzljs3620320 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-paimon.git
The following commit(s) were added to refs/heads/master by this push:
     new 26219796a  [doc] Document download for Oss and S3 (#688)
26219796a is described below

commit 26219796a7bbbd783629837820e5e6a057735b5f
Author: Jingsong Lee <jingsongl...@gmail.com>
AuthorDate: Thu Mar 23 16:39:51 2023 +0800

    [doc] Document download for Oss and S3 (#688)
---
 docs/content/filesystems/oss.md | 27 ++++++++++++++++++---------
 docs/content/filesystems/s3.md  | 27 ++++++++++++++++++---------
 2 files changed, 36 insertions(+), 18 deletions(-)

diff --git a/docs/content/filesystems/oss.md b/docs/content/filesystems/oss.md
index 5ace5b34a..8338ee76d 100644
--- a/docs/content/filesystems/oss.md
+++ b/docs/content/filesystems/oss.md
@@ -28,16 +28,17 @@ under the License.
 
 {{< stable >}}
 
-## Download
-
-[Download](https://repo.maven.apache.org/maven2/org/apache/flink/paimon-oss/{{< version >}}/paimon-oss-{{< version >}}.jar)
-flink paimon shaded jar.
+Download [paimon-oss-{{< version >}}.jar](https://www.apache.org/dyn/closer.lua/flink/paimon-{{< version >}}/paimon-oss-{{< version >}}.jar).
 
 {{< /stable >}}
 
 {{< unstable >}}
 
-## Build
+Download [paimon-oss-{{< version >}}.jar](https://repository.apache.org/snapshots/org/apache/paimon/paimon-oss/{{< version >}}/).
+
+{{< /unstable >}}
+
+You can also manually build bundled jar from the source code.
 
 To build from source code, [clone the git repository]({{< github_repo >}}).
 
@@ -50,14 +51,14 @@ mvn clean install -DskipTests
 
 You can find the shaded jars under `./paimon-filesystems/paimon-oss/target/paimon-oss-{{< version >}}.jar`.
 
-{{< /unstable >}}
-
-## Usage
-
 {{< tabs "oss" >}}
 
 {{< tab "Flink" >}}
 
+{{< hint info >}}
+If you have already configured oss access through Flink (Via Flink FileSystem), here you can skip the following configuration.
+{{< /hint >}}
+
 Put `paimon-oss-{{< version >}}.jar` into `lib` directory of your Flink home, and create catalog:
 
 ```sql
@@ -74,6 +75,10 @@ CREATE CATALOG my_catalog WITH (
 
 {{< tab "Spark" >}}
 
+{{< hint info >}}
+If you have already configured oss access through Spark (Via Hadoop FileSystem), here you can skip the following configuration.
+{{< /hint >}}
+
 Place `paimon-oss-{{< version >}}.jar` together with `paimon-spark-{{< version >}}.jar` under Spark's jars directory, and start like
 
 ```shell
@@ -89,6 +94,10 @@ spark-sql \
 
 {{< tab "Hive" >}}
 
+{{< hint info >}}
+If you have already configured oss access through Hive (Via Hadoop FileSystem), here you can skip the following configuration.
+{{< /hint >}}
+
 NOTE: You need to ensure that Hive metastore can access `oss`.
 
 Place `paimon-oss-{{< version >}}.jar` together with `paimon-hive-connector-{{< version >}}.jar` under Hive's auxlib directory, and start like
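Editor's note on the oss.md hunks above: the diff only shows the opening line of the Flink `CREATE CATALOG` example as hunk context, so below is a minimal, hypothetical sketch of a Paimon catalog backed by OSS. The bucket, endpoint, and credential values, and the exact option keys, are illustrative assumptions rather than content of this commit; consult the rendered OSS filesystem page for the authoritative options.

```sql
-- Hypothetical sketch only: a Flink SQL Paimon catalog whose warehouse lives on OSS.
-- Bucket name, endpoint, and credentials are placeholders; the option keys are assumed
-- to follow the pattern documented on the OSS filesystem page and may differ by version.
CREATE CATALOG my_catalog WITH (
    'type' = 'paimon',
    'warehouse' = 'oss://my-bucket/paimon-warehouse',
    'fs.oss.endpoint' = 'oss-cn-hangzhou.aliyuncs.com',
    'fs.oss.accessKeyId' = 'your-access-key-id',
    'fs.oss.accessKeySecret' = 'your-access-key-secret'
);

-- Switch to the new catalog before creating tables.
USE CATALOG my_catalog;
```

Such a statement only works once the shaded `paimon-oss` jar from the download or build step described above is on Flink's classpath.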
diff --git a/docs/content/filesystems/s3.md b/docs/content/filesystems/s3.md
index c8467a227..7c497ab45 100644
--- a/docs/content/filesystems/s3.md
+++ b/docs/content/filesystems/s3.md
@@ -28,16 +28,17 @@ under the License.
 
 {{< stable >}}
 
-## Download
-
-[Download](https://repo.maven.apache.org/maven2/org/apache/flink/paimon-s3/{{< version >}}/paimon-s3-{{< version >}}.jar)
-flink paimon shaded jar.
+Download [paimon-s3-{{< version >}}.jar](https://www.apache.org/dyn/closer.lua/flink/paimon-{{< version >}}/paimon-s3-{{< version >}}.jar).
 
 {{< /stable >}}
 
 {{< unstable >}}
 
-## Build
+Download [paimon-s3-{{< version >}}.jar](https://repository.apache.org/snapshots/org/apache/paimon/paimon-s3/{{< version >}}/).
+
+{{< /unstable >}}
+
+You can also manually build bundled jar from the source code.
 
 To build from source code, [clone the git repository]({{< github_repo >}}).
 
@@ -50,14 +51,14 @@ mvn clean install -DskipTests
 
 You can find the shaded jars under `./paimon-filesystems/paimon-s3/target/paimon-s3-{{< version >}}.jar`.
 
-{{< /unstable >}}
-
-## Usage
-
 {{< tabs "oss" >}}
 
 {{< tab "Flink" >}}
 
+{{< hint info >}}
+If you have already configured s3 access through Flink (Via Flink FileSystem), here you can skip the following configuration.
+{{< /hint >}}
+
 Put `paimon-s3-{{< version >}}.jar` into `lib` directory of your Flink home, and create catalog:
 
 ```sql
@@ -74,6 +75,10 @@ CREATE CATALOG my_catalog WITH (
 
 {{< tab "Spark" >}}
 
+{{< hint info >}}
+If you have already configured s3 access through Spark (Via Hadoop FileSystem), here you can skip the following configuration.
+{{< /hint >}}
+
 Place `paimon-s3-{{< version >}}.jar` together with `paimon-spark-{{< version >}}.jar` under Spark's jars directory, and start like
 
 ```shell
@@ -89,6 +94,10 @@ spark-sql \
 
 {{< tab "Hive" >}}
 
+{{< hint info >}}
+If you have already configured s3 access through Hive (Via Hadoop FileSystem), here you can skip the following configuration.
+{{< /hint >}}
+
 NOTE: You need to ensure that Hive metastore can access `s3`.
 
 Place `paimon-s3-{{< version >}}.jar` together with `paimon-hive-connector-{{< version >}}.jar` under Hive's auxlib directory, and start like
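Editor's note on the s3.md hunks above: as with oss.md, only the opening line of the Flink `CREATE CATALOG` example appears as hunk context, so here is an analogous hypothetical sketch for an S3 warehouse. The bucket, endpoint, and credential values, and the exact option keys, are illustrative assumptions rather than content of this commit; consult the rendered S3 filesystem page for the authoritative options.

```sql
-- Hypothetical sketch only: a Flink SQL Paimon catalog whose warehouse lives on S3.
-- Bucket name, endpoint, and credentials are placeholders; the option keys are assumed
-- to follow the pattern documented on the S3 filesystem page and may differ by version.
CREATE CATALOG my_catalog WITH (
    'type' = 'paimon',
    'warehouse' = 's3://my-bucket/paimon-warehouse',
    's3.endpoint' = 'your-endpoint-hostname',
    's3.access-key' = 'your-access-key',
    's3.secret-key' = 'your-secret-key'
);

-- Switch to the new catalog before creating tables.
USE CATALOG my_catalog;
```

As above, the shaded `paimon-s3` jar from the download or build step has to be on Flink's classpath for the catalog to resolve the `s3://` warehouse path.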