This is an automated email from the ASF dual-hosted git repository.
peacewong pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/linkis-website.git
The following commit(s) were added to refs/heads/dev by this push:
new b20a6d2bc0e add hive on tez (#748)
b20a6d2bc0e is described below
commit b20a6d2bc0ebdf8a15432b8e65b7d48869cf5e2d
Author: peacewong <[email protected]>
AuthorDate: Tue Aug 15 11:48:44 2023 +0800
add hive on tez (#748)
* update sit to default
* update hive on tez doc
---
docs/deployment/install-engineconn.md | 2 +-
docs/engine-usage/hive.md | 6 ++++--
.../current/deployment/install-engineconn.md | 2 +-
.../docusaurus-plugin-content-docs/current/engine-usage/hive.md | 7 +++++--
.../version-0.11.0/deployment/engine-conn-plugin-installation.md | 2 +-
.../version-1.2.0/engine-usage/elasticsearch.md | 2 +-
.../version-1.2.0/engine-usage/flink.md | 4 ++--
.../version-1.2.0/engine-usage/jdbc.md | 2 +-
.../version-1.2.0/engine-usage/openlookeng.md | 2 +-
.../version-1.2.0/engine-usage/pipeline.md | 2 +-
.../version-1.2.0/engine-usage/presto.md | 2 +-
.../version-1.2.0/engine-usage/sqoop.md | 2 +-
.../version-1.3.0/deployment/install-engineconn.md | 2 +-
.../version-1.3.0/engine-usage/elasticsearch.md | 4 ++--
.../version-1.3.0/engine-usage/flink.md | 6 +++---
.../version-1.3.0/engine-usage/jdbc.md | 4 ++--
.../version-1.3.0/engine-usage/openlookeng.md | 4 ++--
.../version-1.3.0/engine-usage/pipeline.md | 4 ++--
.../version-1.3.0/engine-usage/presto.md | 4 ++--
.../version-1.3.0/engine-usage/sqoop.md | 4 ++--
.../version-1.3.1/deployment/install-engineconn.md | 2 +-
.../version-1.3.2/deployment/install-engineconn.md | 2 +-
.../version-1.4.0/deployment/install-engineconn.md | 2 +-
.../version-0.11.0/deployment/engine-conn-plugin-installation.md | 2 +-
.../version-1.0.2/deployment/engine-conn-plugin-installation.md | 2 +-
.../version-1.0.3/deployment/engine-conn-plugin-installation.md | 2 +-
versioned_docs/version-1.0.3/engine-usage/flink.md | 2 +-
.../version-1.1.0/deployment/engine-conn-plugin-installation.md | 2 +-
versioned_docs/version-1.1.0/engine-usage/flink.md | 2 +-
.../version-1.1.1/deployment/engine-conn-plugin-installation.md | 2 +-
versioned_docs/version-1.1.1/engine-usage/flink.md | 2 +-
versioned_docs/version-1.1.1/engine-usage/pipeline.md | 2 +-
.../version-1.1.2/deployment/engine-conn-plugin-installation.md | 2 +-
versioned_docs/version-1.1.2/engine-usage/flink.md | 2 +-
versioned_docs/version-1.1.2/engine-usage/pipeline.md | 2 +-
versioned_docs/version-1.1.2/engine-usage/sqoop.md | 2 +-
.../version-1.1.3/deployment/engine-conn-plugin-installation.md | 2 +-
versioned_docs/version-1.1.3/engine-usage/flink.md | 2 +-
versioned_docs/version-1.1.3/engine-usage/pipeline.md | 2 +-
versioned_docs/version-1.1.3/engine-usage/sqoop.md | 2 +-
.../version-1.2.0/deployment/engine-conn-plugin-installation.md | 2 +-
versioned_docs/version-1.2.0/engine-usage/flink.md | 2 +-
versioned_docs/version-1.2.0/engine-usage/pipeline.md | 2 +-
versioned_docs/version-1.2.0/engine-usage/presto.md | 2 +-
versioned_docs/version-1.2.0/engine-usage/sqoop.md | 2 +-
versioned_docs/version-1.3.0/deployment/install-engineconn.md | 4 ++--
versioned_docs/version-1.3.0/engine-usage/flink.md | 2 +-
versioned_docs/version-1.3.0/engine-usage/pipeline.md | 2 +-
versioned_docs/version-1.3.0/engine-usage/presto.md | 2 +-
versioned_docs/version-1.3.0/engine-usage/sqoop.md | 2 +-
versioned_docs/version-1.3.1/deployment/install-engineconn.md | 4 ++--
versioned_docs/version-1.3.2/deployment/install-engineconn.md | 2 +-
versioned_docs/version-1.4.0/deployment/install-engineconn.md | 4 ++--
53 files changed, 72 insertions(+), 67 deletions(-)
diff --git a/docs/deployment/install-engineconn.md
b/docs/deployment/install-engineconn.md
index 236757f125b..ef22b961967 100644
--- a/docs/deployment/install-engineconn.md
+++ b/docs/deployment/install-engineconn.md
@@ -78,7 +78,7 @@ If it is an existing engine and a new version is added, you
can modify the versi
### cd to the sbin directory, restart linkis-engineconn-plugin-server
cd /Linkis1.0.0/sbin
## Execute linkis-daemon script
-sh linkis-daemon.sh restart linkis-engine-plugin-server
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
3.Check whether the engine refresh is successful: If you encounter problems
during the refresh process and need to confirm whether the refresh is
successful, you can check whether the last_update_time of the
linkis_engine_conn_plugin_bml_resources table in the database is the time when
the refresh is triggered.
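For reference, the check above can be run against the Linkis database with a
query along these lines; the connection details and database name are
assumptions and depend on the deployment.
```bash
# Assumed MySQL host, user, and database name; adjust to the actual deployment
mysql -h 127.0.0.1 -u linkis -p -D linkis -e \
  "SELECT * FROM linkis_engine_conn_plugin_bml_resources ORDER BY last_update_time DESC LIMIT 10;"
```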
diff --git a/docs/engine-usage/hive.md b/docs/engine-usage/hive.md
index aac1f8d8556..93b82760211 100644
--- a/docs/engine-usage/hive.md
+++ b/docs/engine-usage/hive.md
@@ -38,9 +38,11 @@ default
The binary installation package released by `linkis` includes the `Hive`
engine plug-in by default, and users do not need to install it additionally.
-The version of `Hive` supports `hive1.x` and `hive2.x`. The default is to
support `hive on MapReduce`. If you want to change to `Hive on Tez`, you need
to modify it according to this `pr`.
+The version of `Hive` supports `hive1.x` and `hive2.x`. The default is to
support `hive on MapReduce`. If you want to switch to `Hive on Tez`, Linkis is
compatible with Hive on Tez and requires the following steps:
+- Copy the Tez-related dependencies to
{LINKIS_HOME}/lib/linkis-engineconn-plugins/hive/dist/3.1.3/lib (note that this
is the dist directory, not the plugin directory). You can also modify the hive
ec pom to add the tez dependency and recompile.
+- vim
{LINKIS_HOME}/lib/linkis-engineconn-plugins/hive/dist/3.1.3/conf/linkis-engineconn.properties
and set linkis.hive.engine.type=tez
+- sh linkis-daemon.sh restart linkis-cg-linkismanager
-<https://github.com/apache/linkis/pull/541>
The `hive` version supported by default is 3.1.3. If you want to modify the
`hive` version, you can find the `linkis-engineplugin-hive` module, modify the
\<hive.version\> tag, and then compile this module separately.
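As a rough illustration of the Tez steps above, a minimal shell sketch follows;
the Linkis home path, the Tez installation path, and the jar names are
assumptions and must be adapted to the actual environment.
```bash
# Assumed locations; adjust LINKIS_HOME and the Tez path to your environment
export LINKIS_HOME=/appcom/Install/linkis
HIVE_EC_DIST=${LINKIS_HOME}/lib/linkis-engineconn-plugins/hive/dist/3.1.3

# 1. Copy the Tez-related dependencies into the dist lib directory (not the plugin directory)
cp /opt/tez/*.jar "${HIVE_EC_DIST}/lib/"

# 2. Switch the Hive engine type to tez
echo "linkis.hive.engine.type=tez" >> "${HIVE_EC_DIST}/conf/linkis-engineconn.properties"

# 3. Restart LinkisManager so the engine material is refreshed
cd ${LINKIS_HOME}/sbin
sh linkis-daemon.sh restart linkis-cg-linkismanager
```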
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/install-engineconn.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/install-engineconn.md
index 5dd98f4b268..f8f0ccb4e61 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/install-engineconn.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/current/deployment/install-engineconn.md
@@ -164,7 +164,7 @@ cd ${LINKIS_HOME}/sbin
## Execute the linkis-daemon script
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/hive.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/hive.md
index e5783c2f15e..4a5e724bd0d 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/hive.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/engine-usage/hive.md
@@ -38,9 +38,12 @@ default
The binary installation package released by `linkis` includes the `Hive` engine plugin by default; users do not need to install it separately.
-The `Hive` version supports `Hive1.x` and `Hive2.x`. The default is `Hive on
MapReduce`. If you want to change to `Hive on Tez`, you need to make changes
according to this `PR`.
+The `Hive` version supports `Hive1.x` and `Hive2.x`. The default is `Hive on
MapReduce`. If you want to switch to `Hive on Tez`,
+Linkis is compatible with Hive on Tez and requires the following steps:
+- Copy the Tez-related dependencies to
{LINKIS_HOME}/lib/linkis-engineconn-plugins/hive/dist/3.1.3/lib (the dist
directory, not the plugin directory). You can also modify the hive ec pom to
add the tez dependency and recompile.
+- vim
{LINKIS_HOME}/lib/linkis-engineconn-plugins/hive/dist/3.1.3/conf/linkis-engineconn.properties
and set linkis.hive.engine.type=tez
+- sh linkis-daemon.sh restart linkis-cg-linkismanager
-<https://github.com/apache/linkis/pull/541>
The `Hive` version supported by default is 3.1.3. If you want to modify the
`Hive` version, you can find the `linkis-engineplugin-hive` module, modify the
`<hive.version\>` tag, and then compile this module separately.
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-0.11.0/deployment/engine-conn-plugin-installation.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-0.11.0/deployment/engine-conn-plugin-installation.md
index 7705cb71cfc..9fab6ae3eb4 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-0.11.0/deployment/engine-conn-plugin-installation.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-0.11.0/deployment/engine-conn-plugin-installation.md
@@ -73,6 +73,6 @@ cd /Linkis1.0.0/sbin
## Execute the linkis-daemon script
-sh linkis-daemon.sh restart linkis-engine-plugin-server
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/elasticsearch.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/elasticsearch.md
index 7bf1a20b9b7..7275fdb07cb 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/elasticsearch.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/elasticsearch.md
@@ -36,7 +36,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
And restart linkis-engineplugin (or refresh through the engine interface)
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
### 2.3 Engine labels
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/flink.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/flink.md
index 8feb80cbc1a..0c9dd9a7547 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/flink.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/flink.md
@@ -55,7 +55,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
And restart linkis-engineplugin
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
A more detailed introduction to engineplugin can be found in the following article.
https://linkis.apache.org/zh-CN/docs/1.1.1/deployment/engine-conn-plugin-installation
@@ -82,7 +82,7 @@ Linkis' Flink engine has two execution methods; one is the ComputationEngineConn method
FlinkSQL can support a variety of data sources, such as binlog, kafka, hive, etc. If you want to use these data sources in Flink code, you need to put the plug-in jar packages of these connectors into the lib of the flink engine, and restart the Linkis EnginePlugin service. If you want to use binlog as a data source in your FlinkSQL, you need to put flink-connector-mysql-cdc-1.1.1.jar into the lib of the flink engine.
```bash
cd ${LINKS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
### 3.1 ComputationEngineConn method
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/jdbc.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/jdbc.md
index c0bbe487506..21fe5aa47d9 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/jdbc.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/jdbc.md
@@ -38,7 +38,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
And restart linkis-engineplugin (or refresh through the engine interface)
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
### 2.3 Engine labels
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/openlookeng.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/openlookeng.md
index 4e6df359f34..0da1e43bef7 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/openlookeng.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/openlookeng.md
@@ -41,7 +41,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
And restart linkis-engineplugin (or refresh through the engine interface)
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
### 2.3 Engine labels
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/pipeline.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/pipeline.md
index f9f1507439f..ac7336493ca 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/pipeline.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/pipeline.md
@@ -40,7 +40,7 @@
${linkis_code_dir}/linkis-engineconn-plugins/pipeline/target/out/pipeline
And restart linkis-engineplugin to refresh the engine
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
Check whether the engine refresh is successful: you can check whether the last_update_time of the linkis_engine_conn_plugin_bml_resources table in the database is the time when the refresh was triggered.
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/presto.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/presto.md
index b65209a94ab..6dc9589f58d 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/presto.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/presto.md
@@ -37,7 +37,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
And restart linkis-engineplugin (or refresh through the engine interface)
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
Check whether the engine refresh is successful: you can check whether the last_update_time of the linkis_engine_conn_plugin_bml_resources table in the database is the time when the refresh was triggered.
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/sqoop.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/sqoop.md
index 46dc7dec5ec..5eb8535711e 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/sqoop.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2.0/engine-usage/sqoop.md
@@ -68,7 +68,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
And restart linkis-engineplugin
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
A more detailed introduction to engineplugin can be found in the following article.
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/deployment/install-engineconn.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/deployment/install-engineconn.md
index 139ed57f2ce..8f410a67c9a 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/deployment/install-engineconn.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/deployment/install-engineconn.md
@@ -164,7 +164,7 @@ cd ${LINKIS_HOME}/sbin
## Execute the linkis-daemon script
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/elasticsearch.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/elasticsearch.md
index 7bf1a20b9b7..296d3c24878 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/elasticsearch.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/elasticsearch.md
@@ -33,10 +33,10 @@
${linkis_code_dir}/linkis-engineconn-plugins/jdbc/target/out/elasticsearch
```bash
${LINKIS_HOME}/lib/linkis-engineplugins
```
-And restart linkis-engineplugin (or refresh through the engine interface)
+And restart linkis-cg-linkismanager (or refresh through the engine interface)
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
### 2.3 Engine labels
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/flink.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/flink.md
index c48e89586c8..34ed2d3e1fd 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/flink.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/flink.md
@@ -52,10 +52,10 @@
${linkis_code_dir}/linkis-engineconn-plugins/flink/target/flink-engineconn.zip
```bash
${LINKIS_HOME}/lib/linkis-engineplugins
```
-And restart linkis-engineplugin
+And restart linkis-cg-linkismanager
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
A more detailed introduction to engineplugin can be found in the following article.
https://linkis.apache.org/zh-CN/docs/1.1.1/deployment/install-engineconn
@@ -82,7 +82,7 @@ Linkis' Flink engine has two execution methods; one is the ComputationEngineConn method
FlinkSQL can support a variety of data sources, such as binlog, kafka, hive, etc. If you want to use these data sources in Flink code, you need to put the plug-in jar packages of these connectors into the lib of the flink engine, and restart the Linkis EnginePlugin service. If you want to use binlog as a data source in your FlinkSQL, you need to put flink-connector-mysql-cdc-1.1.1.jar into the lib of the flink engine.
```bash
cd ${LINKS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
### 3.1 ComputationEngineConn method
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/jdbc.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/jdbc.md
index 92524b3d562..ffef4720ad0 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/jdbc.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/jdbc.md
@@ -35,10 +35,10 @@
${linkis_code_dir}/linkis-engineconn-plugins/jdbc/target/out/jdbc
```bash
${LINKIS_HOME}/lib/linkis-engineplugins
```
-And restart linkis-engineplugin (or refresh through the engine interface)
+And restart linkis-cg-linkismanager (or refresh through the engine interface)
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
### 2.3 Engine labels
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/openlookeng.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/openlookeng.md
index 294b55213a1..ef6bc6c2370 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/openlookeng.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/openlookeng.md
@@ -38,10 +38,10 @@
${linkis_code_dir}/linkis-engineconn-plugins/openlookeng/target/out/openlookeng
```bash
${LINKIS_HOME}/lib/linkis-engineplugins
```
-And restart linkis-engineplugin (or refresh through the engine interface)
+And restart linkis-cg-linkismanager (or refresh through the engine interface)
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
### 2.3 Engine labels
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/pipeline.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/pipeline.md
index 96198c548e1..6a4d94ce764 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/pipeline.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/pipeline.md
@@ -37,10 +37,10 @@
${linkis_code_dir}/linkis-engineconn-plugins/pipeline/target/out/pipeline
Upload the engine material package obtained in step 1.1 to the engine directory `${LINKIS_HOME}/lib/linkis-engineplugins` on the server
-And restart linkis-engineplugin to refresh the engine
+And restart linkis-cg-linkismanager to refresh the engine
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
Check whether the engine refresh is successful: you can check whether the last_update_time of the linkis_engine_conn_plugin_bml_resources table in the database is the time when the refresh was triggered.
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/presto.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/presto.md
index b65209a94ab..0d60fc71620 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/presto.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/presto.md
@@ -34,10 +34,10 @@
${linkis_code_dir}/linkis-engineconn-plugins/jdbc/target/out/presto
```bash
${LINKIS_HOME}/lib/linkis-engineplugins
```
-And restart linkis-engineplugin (or refresh through the engine interface)
+And restart linkis-cg-linkismanager (or refresh through the engine interface)
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
Check whether the engine refresh is successful: you can check whether the last_update_time of the linkis_engine_conn_plugin_bml_resources table in the database is the time when the refresh was triggered.
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/sqoop.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/sqoop.md
index 88b0104fca6..bd6094d089c 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/sqoop.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.0/engine-usage/sqoop.md
@@ -65,10 +65,10 @@
${linkis_code_dir}/linkis-enginepconn-plugins/engineconn-plugins/sqoop/target/sq
```bash
${LINKIS_HOME}/lib/linkis-engineplugins
```
-And restart linkis-engineplugin
+And restart linkis-cg-linkismanager
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
A more detailed introduction to engineplugin can be found in the following article.
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.1/deployment/install-engineconn.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.1/deployment/install-engineconn.md
index 0622eff2e71..074535a8d26 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.1/deployment/install-engineconn.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.1/deployment/install-engineconn.md
@@ -164,7 +164,7 @@ cd ${LINKIS_HOME}/sbin
## Execute the linkis-daemon script
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.2/deployment/install-engineconn.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.2/deployment/install-engineconn.md
index 5dd98f4b268..f8f0ccb4e61 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.2/deployment/install-engineconn.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.3.2/deployment/install-engineconn.md
@@ -164,7 +164,7 @@ cd ${LINKIS_HOME}/sbin
## Execute the linkis-daemon script
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
diff --git
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.4.0/deployment/install-engineconn.md
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.4.0/deployment/install-engineconn.md
index 5dd98f4b268..f8f0ccb4e61 100644
---
a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.4.0/deployment/install-engineconn.md
+++
b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.4.0/deployment/install-engineconn.md
@@ -164,7 +164,7 @@ cd ${LINKIS_HOME}/sbin
## Execute the linkis-daemon script
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
diff --git
a/versioned_docs/version-0.11.0/deployment/engine-conn-plugin-installation.md
b/versioned_docs/version-0.11.0/deployment/engine-conn-plugin-installation.md
index 39bbc89f85e..efd6aa56583 100644
---
a/versioned_docs/version-0.11.0/deployment/engine-conn-plugin-installation.md
+++
b/versioned_docs/version-0.11.0/deployment/engine-conn-plugin-installation.md
@@ -73,5 +73,5 @@ cd /Linkis1.0.0/sbin
## Execute linkis-daemon script
-sh linkis-daemon.sh restart linkis-engine-plugin-server
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
\ No newline at end of file
diff --git
a/versioned_docs/version-1.0.2/deployment/engine-conn-plugin-installation.md
b/versioned_docs/version-1.0.2/deployment/engine-conn-plugin-installation.md
index f134b313165..3b51f984a7f 100644
--- a/versioned_docs/version-1.0.2/deployment/engine-conn-plugin-installation.md
+++ b/versioned_docs/version-1.0.2/deployment/engine-conn-plugin-installation.md
@@ -81,7 +81,7 @@ If it is an existing engine and a new version is added, you
can modify the versi
### cd to the sbin directory, restart linkis-engineconn-plugin-server
cd /Linkis1.0.0/sbin
## Execute linkis-daemon script
-sh linkis-daemon.sh restart linkis-engine-plugin-server
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
3.Check whether the engine refresh is successful: If you encounter problems
during the refresh process and need to confirm whether the refresh is
successful, you can check whether the last_update_time of the
linkis_engine_conn_plugin_bml_resources table in the database is the time when
the refresh is triggered.
diff --git
a/versioned_docs/version-1.0.3/deployment/engine-conn-plugin-installation.md
b/versioned_docs/version-1.0.3/deployment/engine-conn-plugin-installation.md
index 21b221d9366..8b049e18aa7 100644
--- a/versioned_docs/version-1.0.3/deployment/engine-conn-plugin-installation.md
+++ b/versioned_docs/version-1.0.3/deployment/engine-conn-plugin-installation.md
@@ -81,7 +81,7 @@ If it is an existing engine and a new version is added, you
can modify the versi
### cd to the sbin directory, restart linkis-engineconn-plugin-server
cd /Linkis1.0.0/sbin
## Execute linkis-daemon script
-sh linkis-daemon.sh restart linkis-engine-plugin-server
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
3.Check whether the engine refresh is successful: If you encounter problems
during the refresh process and need to confirm whether the refresh is
successful, you can check whether the last_update_time of the
linkis_engine_conn_plugin_bml_resources table in the database is the time when
the refresh is triggered.
diff --git a/versioned_docs/version-1.0.3/engine-usage/flink.md
b/versioned_docs/version-1.0.3/engine-usage/flink.md
index ec036c38577..ec37fa07a78 100644
--- a/versioned_docs/version-1.0.3/engine-usage/flink.md
+++ b/versioned_docs/version-1.0.3/engine-usage/flink.md
@@ -81,7 +81,7 @@ Linkis’ Flink engine has two execution methods. One is the
ComputationEngineCo
FlinkSQL can support a variety of data sources, such as binlog, kafka, hive,
etc. If you want to use these data sources in Flink code, you need to put the
plug-in jar packages of these connectors into the lib of the flink engine, and
restart Linkis EnginePlugin service. If you want to use binlog as a data source
in your FlinkSQL, then you need to put flink-connector-mysql-cdc-1.1.1.jar into
the lib of the flink engine.
```bash
cd ${LINKS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
### 3.1 ComputationEngineConn method
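To make the connector step above concrete, a small sketch follows; the flink
engine version directory and the location of the connector jar are assumptions
and depend on the installed flink engine.
```bash
# Assumed paths; the flink dist version directory may differ per installation
export LINKIS_HOME=/appcom/Install/linkis
FLINK_EC_LIB=${LINKIS_HOME}/lib/linkis-engineconn-plugins/flink/dist/1.12.2/lib

# Put the binlog connector jar into the flink engine lib
cp /opt/connectors/flink-connector-mysql-cdc-1.1.1.jar "${FLINK_EC_LIB}/"

# Restart so the engine plugin material is refreshed
cd ${LINKIS_HOME}/sbin
sh linkis-daemon.sh restart linkis-cg-linkismanager
```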
diff --git
a/versioned_docs/version-1.1.0/deployment/engine-conn-plugin-installation.md
b/versioned_docs/version-1.1.0/deployment/engine-conn-plugin-installation.md
index 21b221d9366..8b049e18aa7 100644
--- a/versioned_docs/version-1.1.0/deployment/engine-conn-plugin-installation.md
+++ b/versioned_docs/version-1.1.0/deployment/engine-conn-plugin-installation.md
@@ -81,7 +81,7 @@ If it is an existing engine and a new version is added, you
can modify the versi
### cd to the sbin directory, restart linkis-engineconn-plugin-server
cd /Linkis1.0.0/sbin
## Execute linkis-daemon script
-sh linkis-daemon.sh restart linkis-engine-plugin-server
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
3.Check whether the engine refresh is successful: If you encounter problems
during the refresh process and need to confirm whether the refresh is
successful, you can check whether the last_update_time of the
linkis_engine_conn_plugin_bml_resources table in the database is the time when
the refresh is triggered.
diff --git a/versioned_docs/version-1.1.0/engine-usage/flink.md
b/versioned_docs/version-1.1.0/engine-usage/flink.md
index ec036c38577..ec37fa07a78 100644
--- a/versioned_docs/version-1.1.0/engine-usage/flink.md
+++ b/versioned_docs/version-1.1.0/engine-usage/flink.md
@@ -81,7 +81,7 @@ Linkis’ Flink engine has two execution methods. One is the
ComputationEngineCo
FlinkSQL can support a variety of data sources, such as binlog, kafka, hive,
etc. If you want to use these data sources in Flink code, you need to put the
plug-in jar packages of these connectors into the lib of the flink engine, and
restart Linkis EnginePlugin service. If you want to use binlog as a data source
in your FlinkSQL, then you need to put flink-connector-mysql-cdc-1.1.1.jar into
the lib of the flink engine.
```bash
cd ${LINKS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
### 3.1 ComputationEngineConn method
diff --git
a/versioned_docs/version-1.1.1/deployment/engine-conn-plugin-installation.md
b/versioned_docs/version-1.1.1/deployment/engine-conn-plugin-installation.md
index 21b221d9366..8b049e18aa7 100644
--- a/versioned_docs/version-1.1.1/deployment/engine-conn-plugin-installation.md
+++ b/versioned_docs/version-1.1.1/deployment/engine-conn-plugin-installation.md
@@ -81,7 +81,7 @@ If it is an existing engine and a new version is added, you
can modify the versi
### cd to the sbin directory, restart linkis-engineconn-plugin-server
cd /Linkis1.0.0/sbin
## Execute linkis-daemon script
-sh linkis-daemon.sh restart linkis-engine-plugin-server
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
3.Check whether the engine refresh is successful: If you encounter problems
during the refresh process and need to confirm whether the refresh is
successful, you can check whether the last_update_time of the
linkis_engine_conn_plugin_bml_resources table in the database is the time when
the refresh is triggered.
diff --git a/versioned_docs/version-1.1.1/engine-usage/flink.md
b/versioned_docs/version-1.1.1/engine-usage/flink.md
index ec036c38577..ec37fa07a78 100644
--- a/versioned_docs/version-1.1.1/engine-usage/flink.md
+++ b/versioned_docs/version-1.1.1/engine-usage/flink.md
@@ -81,7 +81,7 @@ Linkis’ Flink engine has two execution methods. One is the
ComputationEngineCo
FlinkSQL can support a variety of data sources, such as binlog, kafka, hive,
etc. If you want to use these data sources in Flink code, you need to put the
plug-in jar packages of these connectors into the lib of the flink engine, and
restart Linkis EnginePlugin service. If you want to use binlog as a data source
in your FlinkSQL, then you need to put flink-connector-mysql-cdc-1.1.1.jar into
the lib of the flink engine.
```bash
cd ${LINKS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
### 3.1 ComputationEngineConn method
diff --git a/versioned_docs/version-1.1.1/engine-usage/pipeline.md
b/versioned_docs/version-1.1.1/engine-usage/pipeline.md
index 1a13c4feb0a..853a6db17b1 100644
--- a/versioned_docs/version-1.1.1/engine-usage/pipeline.md
+++ b/versioned_docs/version-1.1.1/engine-usage/pipeline.md
@@ -39,7 +39,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
And restart the `linkis engineplugin` to refresh the engine
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
Or refresh through the engine interface. After the engine is placed in the
corresponding directory, send a refresh request to the `linkis CG
engineconplugin service` through the HTTP interface.
-
Interface`http://${engineconn-plugin-server-IP}:${port}/api/rest_j/v1/rpc/receiveAndReply`
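A hedged sketch of such a refresh request is shown below; the gateway address,
port, and request body are assumptions and should be verified against the
engine material refresh section of the deployment documentation for the
version in use.
```bash
# Assumed gateway address and request body; authentication (login cookie/token) is omitted
curl --location --request POST 'http://127.0.0.1:9001/api/rest_j/v1/rpc/receiveAndReply' \
  --header 'Content-Type: application/json' \
  --data-raw '{
    "method": "/enginePlugin/engineConn/refreshAll"
  }'
```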
diff --git
a/versioned_docs/version-1.1.2/deployment/engine-conn-plugin-installation.md
b/versioned_docs/version-1.1.2/deployment/engine-conn-plugin-installation.md
index 21b221d9366..8b049e18aa7 100644
--- a/versioned_docs/version-1.1.2/deployment/engine-conn-plugin-installation.md
+++ b/versioned_docs/version-1.1.2/deployment/engine-conn-plugin-installation.md
@@ -81,7 +81,7 @@ If it is an existing engine and a new version is added, you
can modify the versi
### cd to the sbin directory, restart linkis-engineconn-plugin-server
cd /Linkis1.0.0/sbin
## Execute linkis-daemon script
-sh linkis-daemon.sh restart linkis-engine-plugin-server
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
3.Check whether the engine refresh is successful: If you encounter problems
during the refresh process and need to confirm whether the refresh is
successful, you can check whether the last_update_time of the
linkis_engine_conn_plugin_bml_resources table in the database is the time when
the refresh is triggered.
diff --git a/versioned_docs/version-1.1.2/engine-usage/flink.md
b/versioned_docs/version-1.1.2/engine-usage/flink.md
index ec036c38577..ec37fa07a78 100644
--- a/versioned_docs/version-1.1.2/engine-usage/flink.md
+++ b/versioned_docs/version-1.1.2/engine-usage/flink.md
@@ -81,7 +81,7 @@ Linkis’ Flink engine has two execution methods. One is the
ComputationEngineCo
FlinkSQL can support a variety of data sources, such as binlog, kafka, hive,
etc. If you want to use these data sources in Flink code, you need to put the
plug-in jar packages of these connectors into the lib of the flink engine, and
restart Linkis EnginePlugin service. If you want to use binlog as a data source
in your FlinkSQL, then you need to put flink-connector-mysql-cdc-1.1.1.jar into
the lib of the flink engine.
```bash
cd ${LINKS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
### 3.1 ComputationEngineConn method
diff --git a/versioned_docs/version-1.1.2/engine-usage/pipeline.md
b/versioned_docs/version-1.1.2/engine-usage/pipeline.md
index 33b7a39aa87..1bced7179be 100644
--- a/versioned_docs/version-1.1.2/engine-usage/pipeline.md
+++ b/versioned_docs/version-1.1.2/engine-usage/pipeline.md
@@ -39,7 +39,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
And restart the `linkis engineplugin` to refresh the engine
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
Or refresh through the engine interface. After the engine is placed in the
corresponding directory, send a refresh request to the `linkis CG
engineconplugin service` through the HTTP interface.
-
Interface`http://${engineconn-plugin-server-IP}:${port}/api/rest_j/v1/rpc/receiveAndReply`
diff --git a/versioned_docs/version-1.1.2/engine-usage/sqoop.md
b/versioned_docs/version-1.1.2/engine-usage/sqoop.md
index b7cfce1c89c..805b79f4fcb 100644
--- a/versioned_docs/version-1.1.2/engine-usage/sqoop.md
+++ b/versioned_docs/version-1.1.2/engine-usage/sqoop.md
@@ -57,7 +57,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
and restart linkis-engineplugin
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
More engineplugin details can be found in the following article.
https://linkis.apache.org/docs/1.1.1/deployment/engine-conn-plugin-installation
diff --git
a/versioned_docs/version-1.1.3/deployment/engine-conn-plugin-installation.md
b/versioned_docs/version-1.1.3/deployment/engine-conn-plugin-installation.md
index 21b221d9366..8b049e18aa7 100644
--- a/versioned_docs/version-1.1.3/deployment/engine-conn-plugin-installation.md
+++ b/versioned_docs/version-1.1.3/deployment/engine-conn-plugin-installation.md
@@ -81,7 +81,7 @@ If it is an existing engine and a new version is added, you
can modify the versi
### cd to the sbin directory, restart linkis-engineconn-plugin-server
cd /Linkis1.0.0/sbin
## Execute linkis-daemon script
-sh linkis-daemon.sh restart linkis-engine-plugin-server
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
3.Check whether the engine refresh is successful: If you encounter problems
during the refresh process and need to confirm whether the refresh is
successful, you can check whether the last_update_time of the
linkis_engine_conn_plugin_bml_resources table in the database is the time when
the refresh is triggered.
diff --git a/versioned_docs/version-1.1.3/engine-usage/flink.md
b/versioned_docs/version-1.1.3/engine-usage/flink.md
index ec036c38577..ec37fa07a78 100644
--- a/versioned_docs/version-1.1.3/engine-usage/flink.md
+++ b/versioned_docs/version-1.1.3/engine-usage/flink.md
@@ -81,7 +81,7 @@ Linkis’ Flink engine has two execution methods. One is the
ComputationEngineCo
FlinkSQL can support a variety of data sources, such as binlog, kafka, hive,
etc. If you want to use these data sources in Flink code, you need to put the
plug-in jar packages of these connectors into the lib of the flink engine, and
restart Linkis EnginePlugin service. If you want to use binlog as a data source
in your FlinkSQL, then you need to put flink-connector-mysql-cdc-1.1.1.jar into
the lib of the flink engine.
```bash
cd ${LINKS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
### 3.1 ComputationEngineConn method
diff --git a/versioned_docs/version-1.1.3/engine-usage/pipeline.md
b/versioned_docs/version-1.1.3/engine-usage/pipeline.md
index 5818270e7f9..9f76e3704e3 100644
--- a/versioned_docs/version-1.1.3/engine-usage/pipeline.md
+++ b/versioned_docs/version-1.1.3/engine-usage/pipeline.md
@@ -39,7 +39,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
And restart the `linkis engineplugin` to refresh the engine
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
Or refresh through the engine interface. After the engine is placed in the
corresponding directory, send a refresh request to the `linkis CG
engineconplugin service` through the HTTP interface.
-
Interface`http://${engineconn-plugin-server-IP}:${port}/api/rest_j/v1/rpc/receiveAndReply`
diff --git a/versioned_docs/version-1.1.3/engine-usage/sqoop.md
b/versioned_docs/version-1.1.3/engine-usage/sqoop.md
index b7cfce1c89c..805b79f4fcb 100644
--- a/versioned_docs/version-1.1.3/engine-usage/sqoop.md
+++ b/versioned_docs/version-1.1.3/engine-usage/sqoop.md
@@ -57,7 +57,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
and restart linkis-engineplugin
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
More engineplugin details can be found in the following article.
https://linkis.apache.org/docs/1.1.1/deployment/engine-conn-plugin-installation
diff --git
a/versioned_docs/version-1.2.0/deployment/engine-conn-plugin-installation.md
b/versioned_docs/version-1.2.0/deployment/engine-conn-plugin-installation.md
index 94e6e713572..6f373281fea 100644
--- a/versioned_docs/version-1.2.0/deployment/engine-conn-plugin-installation.md
+++ b/versioned_docs/version-1.2.0/deployment/engine-conn-plugin-installation.md
@@ -81,7 +81,7 @@ If it is an existing engine and a new version is added, you
can modify the versi
### cd to the sbin directory, restart linkis-engineconn-plugin-server
cd /Linkis1.0.0/sbin
## Execute linkis-daemon script
-sh linkis-daemon.sh restart linkis-engine-plugin-server
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
3.Check whether the engine refresh is successful: If you encounter problems
during the refresh process and need to confirm whether the refresh is
successful, you can check whether the last_update_time of the
linkis_engine_conn_plugin_bml_resources table in the database is the time when
the refresh is triggered.
diff --git a/versioned_docs/version-1.2.0/engine-usage/flink.md
b/versioned_docs/version-1.2.0/engine-usage/flink.md
index b9ffd1b93d6..719f6099f92 100644
--- a/versioned_docs/version-1.2.0/engine-usage/flink.md
+++ b/versioned_docs/version-1.2.0/engine-usage/flink.md
@@ -81,7 +81,7 @@ Linkis’ Flink engine has two execution methods. One is the
ComputationEngineCo
FlinkSQL can support a variety of data sources, such as binlog, kafka, hive,
etc. If you want to use these data sources in Flink code, you need to put the
plug-in jar packages of these connectors into the lib of the flink engine, and
restart Linkis EnginePlugin service. If you want to use binlog as a data source
in your FlinkSQL, then you need to put flink-connector-mysql-cdc-1.1.1.jar into
the lib of the flink engine.
```bash
cd ${LINKS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
### 3.1 ComputationEngineConn method
diff --git a/versioned_docs/version-1.2.0/engine-usage/pipeline.md
b/versioned_docs/version-1.2.0/engine-usage/pipeline.md
index 85ff1f2e0e4..dcf18b14695 100644
--- a/versioned_docs/version-1.2.0/engine-usage/pipeline.md
+++ b/versioned_docs/version-1.2.0/engine-usage/pipeline.md
@@ -39,7 +39,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
And restart the `linkis engineplugin` to refresh the engine
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
Or refresh through the engine interface. After the engine is placed in the
corresponding directory, send a refresh request to the `linkis CG
engineconplugin service` through the HTTP interface.
-
Interface`http://${engineconn-plugin-server-IP}:${port}/api/rest_j/v1/rpc/receiveAndReply`
diff --git a/versioned_docs/version-1.2.0/engine-usage/presto.md
b/versioned_docs/version-1.2.0/engine-usage/presto.md
index 03916b1059e..1d91307b4a8 100644
--- a/versioned_docs/version-1.2.0/engine-usage/presto.md
+++ b/versioned_docs/version-1.2.0/engine-usage/presto.md
@@ -38,7 +38,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
And restart linkis-engineplugin (or refresh through the engine interface)
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
Check whether the engine refresh is successful: You can check whether the
last_update_time of the linkis_engine_conn_plugin_bml_resources table in the
database is the time when the refresh is triggered.
diff --git a/versioned_docs/version-1.2.0/engine-usage/sqoop.md
b/versioned_docs/version-1.2.0/engine-usage/sqoop.md
index 9d8086bfde3..b5f73346250 100644
--- a/versioned_docs/version-1.2.0/engine-usage/sqoop.md
+++ b/versioned_docs/version-1.2.0/engine-usage/sqoop.md
@@ -57,7 +57,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
and restart linkis-engineplugin
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
More engineplugin details can be found in the following article.
https://linkis.apache.org/docs/1.1.1/deployment/engine-conn-plugin-installation
diff --git a/versioned_docs/version-1.3.0/deployment/install-engineconn.md
b/versioned_docs/version-1.3.0/deployment/install-engineconn.md
index 6f483bdd13f..36f17477a29 100644
--- a/versioned_docs/version-1.3.0/deployment/install-engineconn.md
+++ b/versioned_docs/version-1.3.0/deployment/install-engineconn.md
@@ -75,10 +75,10 @@ If it is an existing engine and a new version is added, you
can modify the versi
2. Restart refresh: the engine catalog can be forced to refresh by
restarting
```
-### cd to the sbin directory, restart linkis-engineconn-plugin-server
+### cd to the sbin directory, restart linkis-cg-linkismanager
cd /Linkis1.0.0/sbin
## Execute linkis-daemon script
-sh linkis-daemon.sh restart linkis-engine-plugin-server
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
3.Check whether the engine refresh is successful: If you encounter problems
during the refresh process and need to confirm whether the refresh is
successful, you can check whether the last_update_time of the
linkis_engine_conn_plugin_bml_resources table in the database is the time when
the refresh is triggered.
diff --git a/versioned_docs/version-1.3.0/engine-usage/flink.md
b/versioned_docs/version-1.3.0/engine-usage/flink.md
index 7c3613e8a07..33cc44a54e7 100644
--- a/versioned_docs/version-1.3.0/engine-usage/flink.md
+++ b/versioned_docs/version-1.3.0/engine-usage/flink.md
@@ -81,7 +81,7 @@ Linkis’ Flink engine has two execution methods. One is the
ComputationEngineCo
FlinkSQL can support a variety of data sources, such as binlog, kafka, hive,
etc. If you want to use these data sources in Flink code, you need to put the
plug-in jar packages of these connectors into the lib of the flink engine, and
restart Linkis EnginePlugin service. If you want to use binlog as a data source
in your FlinkSQL, then you need to put flink-connector-mysql-cdc-1.1.1.jar into
the lib of the flink engine.
```bash
cd ${LINKS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
### 3.1 ComputationEngineConn method
diff --git a/versioned_docs/version-1.3.0/engine-usage/pipeline.md
b/versioned_docs/version-1.3.0/engine-usage/pipeline.md
index 8fe7fb9a8a2..6c6d79c26b5 100644
--- a/versioned_docs/version-1.3.0/engine-usage/pipeline.md
+++ b/versioned_docs/version-1.3.0/engine-usage/pipeline.md
@@ -39,7 +39,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
And restart the `linkis engineplugin` to refresh the engine
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
Or refresh through the engine interface. After the engine is placed in the
corresponding directory, send a refresh request to the `linkis CG
engineconplugin service` through the HTTP interface.
-
Interface`http://${engineconn-plugin-server-IP}:${port}/api/rest_j/v1/rpc/receiveAndReply`
diff --git a/versioned_docs/version-1.3.0/engine-usage/presto.md
b/versioned_docs/version-1.3.0/engine-usage/presto.md
index 03916b1059e..1d91307b4a8 100644
--- a/versioned_docs/version-1.3.0/engine-usage/presto.md
+++ b/versioned_docs/version-1.3.0/engine-usage/presto.md
@@ -38,7 +38,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
And restart linkis-engineplugin (or refresh through the engine interface)
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
Check whether the engine refresh is successful: You can check whether the
last_update_time of the linkis_engine_conn_plugin_bml_resources table in the
database is the time when the refresh is triggered.
diff --git a/versioned_docs/version-1.3.0/engine-usage/sqoop.md
b/versioned_docs/version-1.3.0/engine-usage/sqoop.md
index c117e47ce47..cd8c77ece41 100644
--- a/versioned_docs/version-1.3.0/engine-usage/sqoop.md
+++ b/versioned_docs/version-1.3.0/engine-usage/sqoop.md
@@ -57,7 +57,7 @@ ${LINKIS_HOME}/lib/linkis-engineplugins
and restart linkis-engineplugin
```bash
cd ${LINKIS_HOME}/sbin
-sh linkis-daemon.sh restart cg-engineplugin
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
More engineplugin details can be found in the following article
[EngineConnPlugin Installation](../deployment/install-engineconn)
diff --git a/versioned_docs/version-1.3.1/deployment/install-engineconn.md
b/versioned_docs/version-1.3.1/deployment/install-engineconn.md
index 5bf47336b9c..c3c275ac4db 100644
--- a/versioned_docs/version-1.3.1/deployment/install-engineconn.md
+++ b/versioned_docs/version-1.3.1/deployment/install-engineconn.md
@@ -75,10 +75,10 @@ If it is an existing engine and a new version is added, you
can modify the versi
2. Restart refresh: the engine catalog can be forced to refresh by
restarting
```
-### cd to the sbin directory, restart linkis-engineconn-plugin-server
+### cd to the sbin directory, restart linkis-cg-linkismanager
cd /Linkis1.0.0/sbin
## Execute linkis-daemon script
-sh linkis-daemon.sh restart linkis-engine-plugin-server
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
3.Check whether the engine refresh is successful: If you encounter problems
during the refresh process and need to confirm whether the refresh is
successful, you can check whether the last_update_time of the
linkis_engine_conn_plugin_bml_resources table in the database is the time when
the refresh is triggered.
diff --git a/versioned_docs/version-1.3.2/deployment/install-engineconn.md
b/versioned_docs/version-1.3.2/deployment/install-engineconn.md
index 236757f125b..ef22b961967 100644
--- a/versioned_docs/version-1.3.2/deployment/install-engineconn.md
+++ b/versioned_docs/version-1.3.2/deployment/install-engineconn.md
@@ -78,7 +78,7 @@ If it is an existing engine and a new version is added, you
can modify the versi
### cd to the sbin directory, restart linkis-engineconn-plugin-server
cd /Linkis1.0.0/sbin
## Execute linkis-daemon script
-sh linkis-daemon.sh restart linkis-engine-plugin-server
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
3.Check whether the engine refresh is successful: If you encounter problems
during the refresh process and need to confirm whether the refresh is
successful, you can check whether the last_update_time of the
linkis_engine_conn_plugin_bml_resources table in the database is the time when
the refresh is triggered.
diff --git a/versioned_docs/version-1.4.0/deployment/install-engineconn.md
b/versioned_docs/version-1.4.0/deployment/install-engineconn.md
index 236757f125b..7775d629a4c 100644
--- a/versioned_docs/version-1.4.0/deployment/install-engineconn.md
+++ b/versioned_docs/version-1.4.0/deployment/install-engineconn.md
@@ -75,10 +75,10 @@ If it is an existing engine and a new version is added, you
can modify the versi
2. Restart refresh: the engine catalog can be forced to refresh by
restarting
```
-### cd to the sbin directory, restart linkis-engineconn-plugin-server
+### cd to the sbin directory, restart linkis-cg-linkismanager
cd /Linkis1.0.0/sbin
## Execute linkis-daemon script
-sh linkis-daemon.sh restart linkis-engine-plugin-server
+sh linkis-daemon.sh restart linkis-cg-linkismanager
```
3.Check whether the engine refresh is successful: If you encounter problems
during the refresh process and need to confirm whether the refresh is
successful, you can check whether the last_update_time of the
linkis_engine_conn_plugin_bml_resources table in the database is the time when
the refresh is triggered.