This is an automated email from the ASF dual-hosted git repository.

trohrmann pushed a commit to branch release-1.12
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 586f2b94704ec9248b8e8bbeae38ca9ce124727c
Author: Till Rohrmann <[email protected]>
AuthorDate: Wed Nov 25 11:03:43 2020 +0100

    [FLINK-20342][docs] Move cli from ops to deployment
---
 docs/{ops => deployment}/cli.md                                     | 6 +++---
 docs/{ops => deployment}/cli.zh.md                                  | 6 +++---
 docs/deployment/resource-providers/docker.md                        | 4 ++--
 docs/deployment/resource-providers/docker.zh.md                     | 4 ++--
 docs/deployment/resource-providers/yarn_setup.md                    | 2 +-
 docs/deployment/resource-providers/yarn_setup.zh.md                 | 2 +-
 docs/dev/cluster_execution.md                                       | 2 +-
 docs/dev/cluster_execution.zh.md                                    | 2 +-
 docs/dev/datastream_api.md                                          | 4 ++--
 docs/dev/datastream_api.zh.md                                       | 4 ++--
 docs/dev/local_execution.md                                         | 2 +-
 docs/dev/local_execution.zh.md                                      | 2 +-
 docs/dev/packaging.md                                               | 2 +-
 docs/dev/packaging.zh.md                                            | 2 +-
 docs/dev/python/datastream-api-users-guide/dependency_management.md | 4 ++--
 .../python/datastream-api-users-guide/dependency_management.zh.md   | 4 ++--
 docs/dev/python/datastream_tutorial.md                              | 2 +-
 docs/dev/python/datastream_tutorial.zh.md                           | 2 +-
 docs/dev/python/faq.md                                              | 2 +-
 docs/dev/python/faq.zh.md                                           | 2 +-
 docs/dev/python/table-api-users-guide/dependency_management.md      | 6 +++---
 docs/dev/python/table-api-users-guide/dependency_management.zh.md   | 6 +++---
 docs/dev/python/table_api_tutorial.md                               | 2 +-
 docs/dev/python/table_api_tutorial.zh.md                            | 2 +-
 docs/dev/table/sqlClient.md                                         | 2 +-
 docs/dev/table/sqlClient.zh.md                                      | 2 +-
 docs/index.md                                                       | 2 +-
 docs/index.zh.md                                                    | 2 +-
 docs/ops/state/checkpoints.md                                       | 2 +-
 docs/ops/state/checkpoints.zh.md                                    | 2 +-
 docs/ops/state/savepoints.md                                        | 2 +-
 docs/ops/state/savepoints.zh.md                                     | 2 +-
 docs/redirects/cli.md                                               | 2 +-
 docs/try-flink/flink-operations-playground.md                       | 2 +-
 docs/try-flink/flink-operations-playground.zh.md                    | 2 +-
 35 files changed, 49 insertions(+), 49 deletions(-)

diff --git a/docs/ops/cli.md b/docs/deployment/cli.md
similarity index 99%
rename from docs/ops/cli.md
rename to docs/deployment/cli.md
index 3632ad5..08bb2be 100644
--- a/docs/ops/cli.md
+++ b/docs/deployment/cli.md
@@ -1,8 +1,8 @@
 ---
 title:  "Command-Line Interface"
-nav-title: CLI
-nav-parent_id: ops
-nav-pos: 1
+nav-title: Command-Line Interface
+nav-parent_id: deployment
+nav-pos: 2
 ---
 <!--
 Licensed to the Apache Software Foundation (ASF) under one
diff --git a/docs/ops/cli.zh.md b/docs/deployment/cli.zh.md
similarity index 99%
rename from docs/ops/cli.zh.md
rename to docs/deployment/cli.zh.md
index 7aabec5..eaa86bb 100644
--- a/docs/ops/cli.zh.md
+++ b/docs/deployment/cli.zh.md
@@ -1,8 +1,8 @@
 ---
 title:  "命令行界面"
-nav-title: CLI
-nav-parent_id: ops
-nav-pos: 1
+nav-title: Command-Line Interface
+nav-parent_id: deployment
+nav-pos: 2
 ---
 <!--
 Licensed to the Apache Software Foundation (ASF) under one
diff --git a/docs/deployment/resource-providers/docker.md b/docs/deployment/resource-providers/docker.md
index 01e1c08..1326b57 100644
--- a/docs/deployment/resource-providers/docker.md
+++ b/docs/deployment/resource-providers/docker.md
@@ -395,13 +395,13 @@ The next chapters show examples of configuration files to run Flink.
 
 * To submit a job to a *Session cluster* via the command line, you can either
 
-  * use [Flink CLI]({% link ops/cli.md %}) on the host if it is installed:
+  * use [Flink CLI]({% link deployment/cli.md %}) on the host if it is installed:
 
     ```sh
     flink run -d -c ${JOB_CLASS_NAME} /job.jar
     ```
 
-  * or copy the JAR to the *JobManager* container and submit the job using the [CLI]({% link ops/cli.md %}) from there, for example:
+  * or copy the JAR to the *JobManager* container and submit the job using the [CLI]({% link deployment/cli.md %}) from there, for example:
 
     ```sh
     JOB_CLASS_NAME="com.job.ClassName"
diff --git a/docs/deployment/resource-providers/docker.zh.md b/docs/deployment/resource-providers/docker.zh.md
index 5cc3573..9904a74 100644
--- a/docs/deployment/resource-providers/docker.zh.md
+++ b/docs/deployment/resource-providers/docker.zh.md
@@ -395,13 +395,13 @@ The next chapters show examples of configuration files to run Flink.
 
 * To submit a job to a *Session cluster* via the command line, you can either
 
-  * use [Flink CLI]({% link ops/cli.zh.md %}) on the host if it is installed:
+  * use [Flink CLI]({% link deployment/cli.zh.md %}) on the host if it is installed:
 
     ```sh
     flink run -d -c ${JOB_CLASS_NAME} /job.jar
     ```
 
-  * or copy the JAR to the *JobManager* container and submit the job using the [CLI]({% link ops/cli.zh.md %}) from there, for example:
+  * or copy the JAR to the *JobManager* container and submit the job using the [CLI]({% link deployment/cli.zh.md %}) from there, for example:
 
     ```sh
     JOB_CLASS_NAME="com.job.ClassName"
diff --git a/docs/deployment/resource-providers/yarn_setup.md b/docs/deployment/resource-providers/yarn_setup.md
index cba5d92..4fffeb9 100644
--- a/docs/deployment/resource-providers/yarn_setup.md
+++ b/docs/deployment/resource-providers/yarn_setup.md
@@ -176,7 +176,7 @@ Use the following command to submit a Flink program to the YARN cluster:
 ./bin/flink
 {% endhighlight %}
 
-Please refer to the documentation of the [command-line client]({% link ops/cli.md %}).
+Please refer to the documentation of the [command-line client]({% link deployment/cli.md %}).
 
 The command will show you a help menu like this:
 
diff --git a/docs/deployment/resource-providers/yarn_setup.zh.md b/docs/deployment/resource-providers/yarn_setup.zh.md
index 1e29ba5..016fc2d 100644
--- a/docs/deployment/resource-providers/yarn_setup.zh.md
+++ b/docs/deployment/resource-providers/yarn_setup.zh.md
@@ -176,7 +176,7 @@ Use the following command to submit a Flink program to the YARN cluster:
 ./bin/flink
 {% endhighlight %}
 
-Please refer to the documentation of the [command-line client]({% link ops/cli.zh.md %}).
+Please refer to the documentation of the [command-line client]({% link deployment/cli.zh.md %}).
 
 The command will show you a help menu like this:
 
diff --git a/docs/dev/cluster_execution.md b/docs/dev/cluster_execution.md
index 8d5d437..c2c135e 100644
--- a/docs/dev/cluster_execution.md
+++ b/docs/dev/cluster_execution.md
@@ -33,7 +33,7 @@ are two ways to send a program to a cluster for execution:
 The command line interface lets you submit packaged programs (JARs) to a cluster
 (or single machine setup).
 
-Please refer to the [Command Line Interface]({{ site.baseurl }}/ops/cli.html) documentation for
+Please refer to the [Command Line Interface]({{ site.baseurl }}/deployment/cli.html) documentation for
 details.
 
 ## Remote Environment
diff --git a/docs/dev/cluster_execution.zh.md b/docs/dev/cluster_execution.zh.md
index c4b4ae2..e772514 100644
--- a/docs/dev/cluster_execution.zh.md
+++ b/docs/dev/cluster_execution.zh.md
@@ -33,7 +33,7 @@ Flink 程序可以分布式运行在多机器集群上。有两种方式可以
 
 命令行界面使你可以将打包的程序(JARs)提交到集群(或单机设置)。
 
-有关详细信息,请参阅[命令行界面]({% link ops/cli.zh.md %})文档。
+有关详细信息,请参阅[命令行界面]({% link deployment/cli.zh.md %})文档。
 
 <a name="remote-environment"></a>
 
diff --git a/docs/dev/datastream_api.md b/docs/dev/datastream_api.md
index 6b5deb0..efc667f 100644
--- a/docs/dev/datastream_api.md
+++ b/docs/dev/datastream_api.md
@@ -100,7 +100,7 @@ the right thing depending on the context: if you are executing your program
 inside an IDE or as a regular Java program it will create a local environment
 that will execute your program on your local machine. If you created a JAR file
 from your program, and invoke it through the [command line]({{ site.baseurl
-}}/ops/cli.html), the Flink cluster manager will execute your main method and
+}}/deployment/cli.html), the Flink cluster manager will execute your main method and
 `getExecutionEnvironment()` will return an execution environment for executing
 your program on a cluster.
 
@@ -170,7 +170,7 @@ the right thing depending on the context: if you are executing your program
 inside an IDE or as a regular Java program it will create a local environment
 that will execute your program on your local machine. If you created a JAR file
 from your program, and invoke it through the [command line]({{ site.baseurl
-}}/ops/cli.html), the Flink cluster manager will execute your main method and
+}}/deployment/cli.html), the Flink cluster manager will execute your main method and
 `getExecutionEnvironment()` will return an execution environment for executing
 your program on a cluster.
 
diff --git a/docs/dev/datastream_api.zh.md b/docs/dev/datastream_api.zh.md
index 2294959..f93fee8 100644
--- a/docs/dev/datastream_api.zh.md
+++ b/docs/dev/datastream_api.zh.md
@@ -100,7 +100,7 @@ the right thing depending on the context: if you are executing your program
 inside an IDE or as a regular Java program it will create a local environment
 that will execute your program on your local machine. If you created a JAR file
 from your program, and invoke it through the [command line]({{ site.baseurl
-}}/ops/cli.html), the Flink cluster manager will execute your main method and
+}}/deployment/cli.html), the Flink cluster manager will execute your main method and
 `getExecutionEnvironment()` will return an execution environment for executing
 your program on a cluster.
 
@@ -170,7 +170,7 @@ the right thing depending on the context: if you are executing your program
 inside an IDE or as a regular Java program it will create a local environment
 that will execute your program on your local machine. If you created a JAR file
 from your program, and invoke it through the [command line]({{ site.baseurl
-}}/ops/cli.html), the Flink cluster manager will execute your main method and
+}}/deployment/cli.html), the Flink cluster manager will execute your main method and
 `getExecutionEnvironment()` will return an execution environment for executing
 your program on a cluster.
 
diff --git a/docs/dev/local_execution.md b/docs/dev/local_execution.md
index d03029b..1a9804a 100644
--- a/docs/dev/local_execution.md
+++ b/docs/dev/local_execution.md
@@ -57,7 +57,7 @@ The `LocalEnvironment` is a handle to local execution for Flink programs. Use it
 
 The local environment is instantiated via the method `ExecutionEnvironment.createLocalEnvironment()`. By default, it will use as many local threads for execution as your machine has CPU cores (hardware contexts). You can alternatively specify the desired parallelism. The local environment can be configured to log to the console using `enableLogging()`/`disableLogging()`.
 
-In most cases, calling `ExecutionEnvironment.getExecutionEnvironment()` is the even better way to go. That method returns a `LocalEnvironment` when the program is started locally (outside the command line interface), and it returns a pre-configured environment for cluster execution, when the program is invoked by the [command line interface]({{ site.baseurl }}/ops/cli.html).
+In most cases, calling `ExecutionEnvironment.getExecutionEnvironment()` is the even better way to go. That method returns a `LocalEnvironment` when the program is started locally (outside the command line interface), and it returns a pre-configured environment for cluster execution, when the program is invoked by the [command line interface]({{ site.baseurl }}/deployment/cli.html).
 
 {% highlight java %}
 public static void main(String[] args) throws Exception {
diff --git a/docs/dev/local_execution.zh.md b/docs/dev/local_execution.zh.md
index 22326ba..1a25a00 100644
--- a/docs/dev/local_execution.zh.md
+++ b/docs/dev/local_execution.zh.md
@@ -57,7 +57,7 @@ The `LocalEnvironment` is a handle to local execution for Flink programs. Use it
 
 The local environment is instantiated via the method `ExecutionEnvironment.createLocalEnvironment()`. By default, it will use as many local threads for execution as your machine has CPU cores (hardware contexts). You can alternatively specify the desired parallelism. The local environment can be configured to log to the console using `enableLogging()`/`disableLogging()`.
 
-In most cases, calling `ExecutionEnvironment.getExecutionEnvironment()` is the even better way to go. That method returns a `LocalEnvironment` when the program is started locally (outside the command line interface), and it returns a pre-configured environment for cluster execution, when the program is invoked by the [command line interface]({{ site.baseurl }}/ops/cli.html).
+In most cases, calling `ExecutionEnvironment.getExecutionEnvironment()` is the even better way to go. That method returns a `LocalEnvironment` when the program is started locally (outside the command line interface), and it returns a pre-configured environment for cluster execution, when the program is invoked by the [command line interface]({{ site.baseurl }}/deployment/cli.html).
 
 {% highlight java %}
 public static void main(String[] args) throws Exception {
diff --git a/docs/dev/packaging.md b/docs/dev/packaging.md
index 8e09392..d538781 100644
--- a/docs/dev/packaging.md
+++ b/docs/dev/packaging.md
@@ -27,7 +27,7 @@ under the License.
 As described earlier, Flink programs can be executed on
 clusters by using a `remote environment`. Alternatively, programs can be packaged into JAR Files
 (Java Archives) for execution. Packaging the program is a prerequisite to executing them through the
-[command line interface]({{ site.baseurl }}/ops/cli.html).
+[command line interface]({{ site.baseurl }}/deployment/cli.html).
 
 ### Packaging Programs
 
diff --git a/docs/dev/packaging.zh.md b/docs/dev/packaging.zh.md
index 1501eb0..357b979 100644
--- a/docs/dev/packaging.zh.md
+++ b/docs/dev/packaging.zh.md
@@ -24,7 +24,7 @@ under the License.
 -->
 
 
-正如之前所描述的,Flink 程序可以使用 `remote environment` 在集群上执行。或者,程序可以被打包成 JAR 文件(Java Archives)执行。如果使用[命令行]({% link ops/cli.zh.md %})的方式执行程序,将程序打包是必需的。
+正如之前所描述的,Flink 程序可以使用 `remote environment` 在集群上执行。或者,程序可以被打包成 JAR 文件(Java Archives)执行。如果使用[命令行]({% link deployment/cli.zh.md %})的方式执行程序,将程序打包是必需的。
 
 <a name="packaging-programs"></a>
 
diff --git a/docs/dev/python/datastream-api-users-guide/dependency_management.md b/docs/dev/python/datastream-api-users-guide/dependency_management.md
index 6c948bd..f14c71b 100644
--- a/docs/dev/python/datastream-api-users-guide/dependency_management.md
+++ b/docs/dev/python/datastream-api-users-guide/dependency_management.md
@@ -25,7 +25,7 @@ under the License.
 # Java Dependency
 
 If third-party Java dependencies are used, you can specify the dependencies with the following Python DataStream APIs or
-through [command line arguments]({% link ops/cli.md %}#usage) directly when submitting the job.
+through [command line arguments]({% link deployment/cli.md %}#usage) directly when submitting the job.
 
 {% highlight python %}
 # Use the add_jars() to add local jars and the jars will be uploaded to the cluster.
@@ -41,7 +41,7 @@ stream_execution_environment.add_classpaths("file:///my/jar/path/connector.jar",
 # Python Dependency
 
 If third-party Python dependencies are used, you can specify the dependencies with the following Python DataStream
-APIs or through [command line arguments]({% link ops/cli.md %}#usage) directly when submitting the job.
+APIs or through [command line arguments]({% link deployment/cli.md %}#usage) directly when submitting the job.
 
 <table class="table table-bordered">
   <thead>
diff --git a/docs/dev/python/datastream-api-users-guide/dependency_management.zh.md b/docs/dev/python/datastream-api-users-guide/dependency_management.zh.md
index 5d37025..2941b97 100644
--- a/docs/dev/python/datastream-api-users-guide/dependency_management.zh.md
+++ b/docs/dev/python/datastream-api-users-guide/dependency_management.zh.md
@@ -26,7 +26,7 @@ under the License.
 
 # Java 依赖管理
 
-如果应用了第三方 Java 依赖, 用户可以通过以下 Python DataStream API进行配置,或者在提交作业时直接通过[命令行参数]({% link ops/cli.zh.md %}#usage)配置。
+如果应用了第三方 Java 依赖, 用户可以通过以下 Python DataStream API进行配置,或者在提交作业时直接通过[命令行参数]({% link deployment/cli.zh.md %}#usage)配置。
 
 {% highlight python %}
 # 通过 add_jars() 添加本地 jar 包依赖,这些 jar 包最终会被上传到集群中。
@@ -42,7 +42,7 @@ stream_execution_environment.add_classpaths("file:///my/jar/path/connector.jar",
 <a name="python-dependency-in-python-program"/>
 
 # Python 依赖管理
-如果 Python DataStream 程序中应用到了 Python 第三方依赖,用户可以使用以下 API 配置依赖信息,或在提交作业时直接通过[命令行参数]({% link ops/cli.zh.md %}#usage)配置。
+如果 Python DataStream 程序中应用到了 Python 第三方依赖,用户可以使用以下 API 配置依赖信息,或在提交作业时直接通过[命令行参数]({% link deployment/cli.zh.md %}#usage)配置。
 
 <table class="table table-bordered">
   <thead>
diff --git a/docs/dev/python/datastream_tutorial.md b/docs/dev/python/datastream_tutorial.md
index c0911cf..61d3698 100644
--- a/docs/dev/python/datastream_tutorial.md
+++ b/docs/dev/python/datastream_tutorial.md
@@ -130,7 +130,7 @@ Next, you can run the example you just created on the command line:
 $ python datastream_tutorial.py
 {% endhighlight %}
 
-The command builds and runs your PyFlink program in a local mini cluster. You can alternatively submit it to a remote cluster using the instructions detailed in [Job Submission Examples]({{ site.baseurl }}/ops/cli.html#job-submission-examples).
+The command builds and runs your PyFlink program in a local mini cluster. You can alternatively submit it to a remote cluster using the instructions detailed in [Job Submission Examples]({{ site.baseurl }}/deployment/cli.html#job-submission-examples).
 
 Finally, you can see the execution result on the command line:
 
diff --git a/docs/dev/python/datastream_tutorial.zh.md b/docs/dev/python/datastream_tutorial.zh.md
index bb621c6..e42f47e 100644
--- a/docs/dev/python/datastream_tutorial.zh.md
+++ b/docs/dev/python/datastream_tutorial.zh.md
@@ -131,7 +131,7 @@ rm -rf /tmp/output
 $ python datastream_tutorial.py
 {% endhighlight %}
 
-这个命令会在本地集群中构建并运行 PyFlink 程序。你也可以使用 [Job Submission Examples]({{ site.baseurl }}/zh/ops/cli.html#job-submission-examples) 中描述的命令将其提交到远程集群。
+这个命令会在本地集群中构建并运行 PyFlink 程序。你也可以使用 [Job Submission Examples]({{ site.baseurl }}/zh/deployment/cli.html#job-submission-examples) 中描述的命令将其提交到远程集群。
 
 最后,你可以在命令行上看到执行结果:
 
diff --git a/docs/dev/python/faq.md b/docs/dev/python/faq.md
index 8e71fea..e85fdc9 100644
--- a/docs/dev/python/faq.md
+++ b/docs/dev/python/faq.md
@@ -66,7 +66,7 @@ For details on the usage of `add_python_archive` and `set_python_executable`, yo
 ## Adding Jar Files
 
 A PyFlink job may depend on jar files, i.e. connectors, Java UDFs, etc.
-You can specify the dependencies with the following Python Table APIs or through [command-line arguments]({% link ops/cli.md %}#usage) directly when submitting the job.
+You can specify the dependencies with the following Python Table APIs or through [command-line arguments]({% link deployment/cli.md %}#usage) directly when submitting the job.
 
 {% highlight python %}
 # NOTE: Only local file URLs (start with "file:") are supported.
diff --git a/docs/dev/python/faq.zh.md b/docs/dev/python/faq.zh.md
index 4afdfb7..daa5673 100644
--- a/docs/dev/python/faq.zh.md
+++ b/docs/dev/python/faq.zh.md
@@ -65,7 +65,7 @@ $ table_env.get_config().set_python_executable("venv.zip/venv/bin/python")
 ## 添加Jar文件
 
 PyFlink作业可能依赖jar文件,比如connector,Java UDF等。
-您可以在提交作业时使用以下Python Table API或通过[命令行参数]({% link ops/cli.zh.md %}#usage)来指定依赖项。
+您可以在提交作业时使用以下Python Table API或通过[命令行参数]({% link deployment/cli.zh.md %}#usage)来指定依赖项。
 
 {% highlight python %}
 # 注意:仅支持本地文件URL(以"file:"开头)。
diff --git a/docs/dev/python/table-api-users-guide/dependency_management.md b/docs/dev/python/table-api-users-guide/dependency_management.md
index a160c9e..247f485 100644
--- a/docs/dev/python/table-api-users-guide/dependency_management.md
+++ b/docs/dev/python/table-api-users-guide/dependency_management.md
@@ -27,7 +27,7 @@ under the License.
 
 # Java Dependency in Python Program
 
-If third-party Java dependencies are used, you can specify the dependencies with the following Python Table APIs or through [command line arguments]({% link ops/cli.md %}#usage) directly when submitting the job.
+If third-party Java dependencies are used, you can specify the dependencies with the following Python Table APIs or through [command line arguments]({% link deployment/cli.md %}#usage) directly when submitting the job.
 
 {% highlight python %}
 # Specify a list of jar URLs via "pipeline.jars". The jars are separated by ";" and will be uploaded to the cluster.
@@ -41,7 +41,7 @@ table_env.get_config().get_configuration().set_string("pipeline.classpaths", "fi
 
 # Python Dependency in Python Program
 
-If third-party Python dependencies are used, you can specify the dependencies with the following Python Table APIs or through [command line arguments]({% link ops/cli.md %}#usage) directly when submitting the job.
+If third-party Python dependencies are used, you can specify the dependencies with the following Python Table APIs or through [command line arguments]({% link deployment/cli.md %}#usage) directly when submitting the job.
 
 <table class="table table-bordered">
   <thead>
@@ -139,4 +139,4 @@ You can refer to the SQL statement about [CREATE FUNCTION]({% link  dev/table/sq
 on how to create Python user-defined functions using SQL statements.
 
 The Python dependencies could be specified via the Python [config options]({% link  dev/python/python_config.md %}#python-options),
-such as **python.archives**, **python.files**, **python.requirements**, **python.client.executable**, **python.executable**. etc or through [command line arguments]({% link ops/cli.md %}#usage) when submitting the job.
+such as **python.archives**, **python.files**, **python.requirements**, **python.client.executable**, **python.executable**. etc or through [command line arguments]({% link deployment/cli.md %}#usage) when submitting the job.
diff --git a/docs/dev/python/table-api-users-guide/dependency_management.zh.md b/docs/dev/python/table-api-users-guide/dependency_management.zh.md
index de9ab24..f1a021b 100644
--- a/docs/dev/python/table-api-users-guide/dependency_management.zh.md
+++ b/docs/dev/python/table-api-users-guide/dependency_management.zh.md
@@ -29,7 +29,7 @@ under the License.
 
 # Java 依赖管理
 
-如果应用了第三方 Java 依赖, 用户可以通过以下 Python Table API进行配置,或者在提交作业时直接通过[命令行参数]({% link ops/cli.zh.md %}#usage)配置。
+如果应用了第三方 Java 依赖, 用户可以通过以下 Python Table API进行配置,或者在提交作业时直接通过[命令行参数]({% link deployment/cli.zh.md %}#usage)配置。
 
 {% highlight python %}
 # 通过 "pipeline.jars" 参数指定 jar 包 URL列表, 每个 URL 使用 ";" 分隔。这些 jar 包最终会被上传到集群中。
@@ -45,7 +45,7 @@ table_env.get_config().get_configuration().set_string("pipeline.classpaths", "fi
 
 # Python 依赖管理
 
-如果程序中应用到了 Python 第三方依赖,用户可以使用以下 Table API 配置依赖信息,或在提交作业时直接通过[命令行参数]({% link ops/cli.zh.md %}#usage)配置。
+如果程序中应用到了 Python 第三方依赖,用户可以使用以下 Table API 配置依赖信息,或在提交作业时直接通过[命令行参数]({% link deployment/cli.zh.md %}#usage)配置。
 
 <table class="table table-bordered">
   <thead>
@@ -145,6 +145,6 @@ You can refer to the SQL statement about [CREATE FUNCTION]({% link  dev/table/sq
 on how to create Python user-defined functions using SQL statements.
 
 The Python dependencies could be specified via the Python [config options]({% link  dev/python/python_config.zh.md %}#python-options),
-such as **python.archives**, **python.files**, **python.requirements**, **python.client.executable**, **python.executable**. etc or through [command line arguments]({% link ops/cli.zh.md %}#usage) when submitting the job.
+such as **python.archives**, **python.files**, **python.requirements**, **python.client.executable**, **python.executable**. etc or through [command line arguments]({% link deployment/cli.zh.md %}#usage) when submitting the job.
 
 
diff --git a/docs/dev/python/table_api_tutorial.md b/docs/dev/python/table_api_tutorial.md
index e7924b2..474a2f7 100644
--- a/docs/dev/python/table_api_tutorial.md
+++ b/docs/dev/python/table_api_tutorial.md
@@ -187,7 +187,7 @@ $ python WordCount.py
 
 The command builds and runs the Python Table API program in a local mini cluster.
 You can also submit the Python Table API program to a remote cluster, you can refer
-[Job Submission Examples]({{ site.baseurl }}/ops/cli.html#job-submission-examples)
+[Job Submission Examples]({{ site.baseurl }}/deployment/cli.html#job-submission-examples)
 for more details.
 
 Finally, you can see the execution result on the command line:
diff --git a/docs/dev/python/table_api_tutorial.zh.md b/docs/dev/python/table_api_tutorial.zh.md
index fcfc623..0212d74 100644
--- a/docs/dev/python/table_api_tutorial.zh.md
+++ b/docs/dev/python/table_api_tutorial.zh.md
@@ -191,7 +191,7 @@ $ python WordCount.py
 {% endhighlight %}
 
 上述命令会构建Python Table API程序,并在本地mini cluster中运行。如果想将作业提交到远端集群执行,
-可以参考[作业提交示例]({{ site.baseurl }}/zh/ops/cli.html#job-submission-examples)。
+可以参考[作业提交示例]({{ site.baseurl }}/zh/deployment/cli.html#job-submission-examples)。
 
 最后,你可以通过如下命令查看你的运行结果:
 
diff --git a/docs/dev/table/sqlClient.md b/docs/dev/table/sqlClient.md
index 76f2f28..a4dc9f4 100644
--- a/docs/dev/table/sqlClient.md
+++ b/docs/dev/table/sqlClient.md
@@ -469,7 +469,7 @@ The SQL Client allows users to create custom, user-defined functions to be used
 
 In order to provide a Java/Scala user-defined function, you need to first implement and compile a function class that extends `ScalarFunction`, `AggregateFunction` or `TableFunction` (see [User-defined Functions]({{ site.baseurl }}/dev/table/functions/udfs.html)). One or more functions can then be packaged into a dependency JAR for the SQL Client.
 
-In order to provide a Python user-defined function, you need to write a Python function and decorate it with the `pyflink.table.udf.udf` or `pyflink.table.udf.udtf` decorator (see [Python UDFs]({% link dev/python/table-api-users-guide/udfs/python_udfs.md %})). One or more functions can then be placed into a Python file. The Python file and related dependencies need to be specified via the configuration (see [Python Configuration]({% link dev/python/python_config.md %})) in environment fi [...]
+In order to provide a Python user-defined function, you need to write a Python function and decorate it with the `pyflink.table.udf.udf` or `pyflink.table.udf.udtf` decorator (see [Python UDFs]({% link dev/python/table-api-users-guide/udfs/python_udfs.md %})). One or more functions can then be placed into a Python file. The Python file and related dependencies need to be specified via the configuration (see [Python Configuration]({% link dev/python/python_config.md %})) in environment fi [...]
 
 All functions must be declared in an environment file before being called. For each item in the list of `functions`, one must specify
 
diff --git a/docs/dev/table/sqlClient.zh.md b/docs/dev/table/sqlClient.zh.md
index 893cf25..cd53edd 100644
--- a/docs/dev/table/sqlClient.zh.md
+++ b/docs/dev/table/sqlClient.zh.md
@@ -469,7 +469,7 @@ SQL 客户端允许用户创建用户自定义的函数来进行 SQL 查询。
 
 为提供 Java/Scala 的自定义函数,你首先需要实现和编译函数类,该函数继承自 `ScalarFunction`、 `AggregateFunction` 或 `TableFunction`(见[自定义函数]({{ site.baseurl }}/zh/dev/table/functions/udfs.html))。一个或多个函数可以打包到 SQL 客户端的 JAR 依赖中。
 
-为提供 Python 的自定义函数,你需要编写 Python 函数并且用装饰器 `pyflink.table.udf.udf` 或 `pyflink.table.udf.udtf` 来装饰(见 [Python UDFs]({% link dev/python/table-api-users-guide/udfs/python_udfs.zh.md %})))。Python 文件中可以放置一个或多个函数。其Python 文件和相关依赖需要通过在环境配置文件中或命令行选项(见 [命令行用法]({{ site.baseurl }}/zh/ops/cli.html#usage))配置中特别指定(见 [Python 配置]({% link dev/python/python_config.zh.md %}))。
+为提供 Python 的自定义函数,你需要编写 Python 函数并且用装饰器 `pyflink.table.udf.udf` 或 `pyflink.table.udf.udtf` 来装饰(见 [Python UDFs]({% link dev/python/table-api-users-guide/udfs/python_udfs.zh.md %})))。Python 文件中可以放置一个或多个函数。其Python 文件和相关依赖需要通过在环境配置文件中或命令行选项(见 [命令行用法]({{ site.baseurl }}/zh/deployment/cli.html#usage))配置中特别指定(见 [Python 配置]({% link dev/python/python_config.zh.md %}))。
 
 所有函数在被调用之前,必须在环境配置文件中提前声明。`functions` 列表中每个函数类都必须指定
 
diff --git a/docs/index.md b/docs/index.md
index 3cda3e8..78fd195 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -69,7 +69,7 @@ The reference documentation covers all the details. Some starting points:
 
 * [Configuration]({% link ops/config.md %})
 * [Rest API]({% link monitoring/rest_api.md %})
-* [CLI]({% link ops/cli.md %})
+* [CLI]({% link deployment/cli.md %})
 
 </div>
 </div>
diff --git a/docs/index.zh.md b/docs/index.zh.md
index e0f9148..b6fd4cd 100644
--- a/docs/index.zh.md
+++ b/docs/index.zh.md
@@ -69,7 +69,7 @@ Apache Flink 是一个在无界和有界数据流上进行状态计算的框架
 
 * [配置参数]({% link ops/config.zh.md %})
 * [Rest API]({% link monitoring/rest_api.zh.md %})
-* [CLI]({% link ops/cli.zh.md %})
+* [CLI]({% link deployment/cli.zh.md %})
 
 </div>
 </div>
diff --git a/docs/ops/state/checkpoints.md b/docs/ops/state/checkpoints.md
index 9be5dcf..f7d0a44 100644
--- a/docs/ops/state/checkpoints.md
+++ b/docs/ops/state/checkpoints.md
@@ -104,7 +104,7 @@ Checkpoints have a few differences from [savepoints]({% link ops/state/savepoint
 
 A job may be resumed from a checkpoint just as from a savepoint
 by using the checkpoint's meta data file instead (see the
-[savepoint restore guide]({% link ops/cli.md %}#restore-a-savepoint)). Note that if the
+[savepoint restore guide]({% link deployment/cli.md %}#restore-a-savepoint)). Note that if the
 meta data file is not self-contained, the jobmanager needs to have access to
 the data files it refers to (see [Directory Structure](#directory-structure)
 above).
diff --git a/docs/ops/state/checkpoints.zh.md b/docs/ops/state/checkpoints.zh.md
index c85decc..257b92c 100644
--- a/docs/ops/state/checkpoints.zh.md
+++ b/docs/ops/state/checkpoints.zh.md
@@ -87,7 +87,7 @@ Checkpoint 与 [savepoints]({% link ops/state/savepoints.zh.md %}) 有一些区
 
 ### 从保留的 checkpoint 中恢复状态
 
-与 savepoint 一样,作业可以从 checkpoint 的元数据文件恢复运行([savepoint恢复指南]({% link ops/cli.zh.md %}#restore-a-savepoint))。注意,如果元数据文件中信息不充分,那么 jobmanager 就需要使用相关的数据文件来恢复作业(参考[目录结构](#directory-structure))。
+与 savepoint 一样,作业可以从 checkpoint 的元数据文件恢复运行([savepoint恢复指南]({% link deployment/cli.zh.md %}#restore-a-savepoint))。注意,如果元数据文件中信息不充分,那么 jobmanager 就需要使用相关的数据文件来恢复作业(参考[目录结构](#directory-structure))。
 
 {% highlight shell %}
 $ bin/flink run -s :checkpointMetaDataPath [:runArgs]
diff --git a/docs/ops/state/savepoints.md b/docs/ops/state/savepoints.md
index 1340bd6..e73b29c 100644
--- a/docs/ops/state/savepoints.md
+++ b/docs/ops/state/savepoints.md
@@ -82,7 +82,7 @@ In the above example, the print sink is stateless and hence not part of the save
 
 ## Operations
 
-You can use the [command line client]({% link ops/cli.md %}#savepoints) to *trigger savepoints*, *cancel a job with a savepoint*, *resume from savepoints*, and *dispose savepoints*.
+You can use the [command line client]({% link deployment/cli.md %}#savepoints) to *trigger savepoints*, *cancel a job with a savepoint*, *resume from savepoints*, and *dispose savepoints*.
 
 With Flink >= 1.2.0 it is also possible to *resume from savepoints* using the webui.
 
diff --git a/docs/ops/state/savepoints.zh.md b/docs/ops/state/savepoints.zh.md
index 3e33ebd..81bf825 100644
--- a/docs/ops/state/savepoints.zh.md
+++ b/docs/ops/state/savepoints.zh.md
@@ -72,7 +72,7 @@ mapper-id   | State of StatefulMapper
 
 ## 算子
 
-你可以使用[命令行客户端]({% link ops/cli.zh.md %}#Savepoint)来*触发 Savepoint*,*触发 Savepoint 并取消作业*,*从 Savepoint* 恢复,以及*删除 Savepoint*。
+你可以使用[命令行客户端]({% link deployment/cli.zh.md %}#Savepoint)来*触发 Savepoint*,*触发 Savepoint 并取消作业*,*从 Savepoint* 恢复,以及*删除 Savepoint*。
 
 从 Flink 1.2.0 开始,还可以使用 webui *从 Savepoint 恢复*。
 
diff --git a/docs/redirects/cli.md b/docs/redirects/cli.md
index 59abf0d..5124a8a 100644
--- a/docs/redirects/cli.md
+++ b/docs/redirects/cli.md
@@ -1,7 +1,7 @@
 ---
 title: "CLI"
 layout: redirect
-redirect: /ops/cli.html
+redirect: /deployment/cli.html
 permalink: /apis/cli.html
 ---
 <!--
diff --git a/docs/try-flink/flink-operations-playground.md b/docs/try-flink/flink-operations-playground.md
index a126977..4213caa 100644
--- a/docs/try-flink/flink-operations-playground.md
+++ b/docs/try-flink/flink-operations-playground.md
@@ -169,7 +169,7 @@ After the initial startup you should mainly see log messages for every checkpoin
 
 ### Flink CLI
 
-The [Flink CLI]({% link ops/cli.md %}) can be used from within the client container. For
+The [Flink CLI]({% link deployment/cli.md %}) can be used from within the client container. For
 example, to print the `help` message of the Flink CLI you can run
 {% highlight bash%}
 docker-compose run --no-deps client flink --help
diff --git a/docs/try-flink/flink-operations-playground.zh.md b/docs/try-flink/flink-operations-playground.zh.md
index 41b31fe..74e1c02 100644
--- a/docs/try-flink/flink-operations-playground.zh.md
+++ b/docs/try-flink/flink-operations-playground.zh.md
@@ -167,7 +167,7 @@ TaskManager 刚启动完成之时,你同样会看到很多关于 checkpoint co
 
 ### Flink CLI
 
-[Flink CLI]({%link ops/cli.zh.md %}) 相关命令可以在 client 容器内进行使用。
+[Flink CLI]({%link deployment/cli.zh.md %}) 相关命令可以在 client 容器内进行使用。
 比如,想查看 Flink CLI 的 `help` 命令,可以通过如下方式进行查看:
 {% highlight bash%}
 docker-compose run --no-deps client flink --help
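
The patch above is a mechanical rewrite of `{% link ops/cli... %}` tags to their new `deployment/` location. For anyone replaying the same migration against another branch by hand, the substitution can be sketched with `sed`; the file name `sample.md` and the exact invocation below are illustrative, not part of this commit:

```shell
# Create a sample doc containing the old Liquid link tags (illustrative content).
printf 'See the [CLI]({%% link ops/cli.md %%}) and [CLI]({%% link ops/cli.zh.md %%}).\n' > sample.md

# Rewrite ops/cli(.zh).md links to their new deployment/ location, as this commit does.
sed -i.bak 's#{% link ops/cli#{% link deployment/cli#g' sample.md

# Both link tags now point at deployment/; no ops/cli references remain.
grep 'deployment/cli' sample.md
```

The `#` delimiter in the `sed` expression avoids escaping the `/` in the paths; `-i.bak` keeps a backup and works with both GNU and BSD sed.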
