klion26 commented on a change in pull request #13271:
URL: https://github.com/apache/flink/pull/13271#discussion_r479893140



##########
File path: docs/monitoring/logging.zh.md
##########
@@ -23,47 +23,51 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-The logging in Flink is implemented using the slf4j logging interface. As 
underlying logging framework, log4j2 is used. We also provide logback 
configuration files and pass them to the JVM's as properties. Users willing to 
use logback instead of log4j2 can just exclude log4j2 (or delete it from the 
lib/ folder).
+Flink 中的日志记录是使用 slf4j 日志接口实现的。使用 log4j2 作为底层日志框架。我们也支持了 logback 
日志配置,只要将其配置文件作为参数传递给 JVM 即可。愿意使用 logback 而不是 log4j2 的用户只需排除 log4j2 的依赖(或从 lib/ 
文件夹中删除它)即可。
 
 * This will be replaced by the TOC
 {:toc}
 
-## Configuring Log4j2
+<a name="configuring-log4j2"></a>
 
-Log4j2 is controlled using property files. In Flink's case, the file is 
usually called `log4j.properties`. We pass the filename and location of this 
file using the `-Dlog4j.configurationFile=` parameter to the JVM.
+## 配置 Log4j2
 
-Flink ships with the following default properties files:
+Log4j2 是使用配置文件指定的。在 Flink 的使用中,该文件通常命名为 `log4j.properties`。我们使用 
`-Dlog4j.configurationFile=` 参数将该文件的文件名和位置传递给 JVM。
 
-- `log4j-cli.properties`: Used by the Flink command line client (e.g. `flink 
run`) (not code executed on the cluster)
-- `log4j-session.properties`: Used by the Flink command line client when 
starting a YARN or Kubernetes session (`yarn-session.sh`, 
`kubernetes-session.sh`)
-- `log4j.properties`: JobManager/Taskmanager logs (both standalone and YARN)
+Flink 附带以下默认日志配置文件:
 
-### Compatibility with Log4j1
+- `log4j-cli.properties`:由 Flink 命令行客户端使用(例如 `flink run`)(不包括在集群上执行的代码)
+- `log4j-session.properties`:Flink 命令行客户端在启动 YARN 或 Kubernetes session 
时使用(`yarn-session.sh`,`kubernetes-session.sh`)
+- `log4j.properties`:作为 JobManager/TaskManager 日志配置使用(standalone 和 YARN 
两种模式下皆使用)
 
-Flink ships with the [Log4j API 
bridge](https://logging.apache.org/log4j/log4j-2.2/log4j-1.2-api/index.html), 
allowing existing applications that work against Log4j1 classes to continue 
working.
+<a name="compatibility-with-log4j1"></a>
 
-If you have custom Log4j1 properties files or code that relies on Log4j1, 
please check out the official Log4j 
[compatibility](https://logging.apache.org/log4j/2.x/manual/compatibility.html) 
and [migration](https://logging.apache.org/log4j/2.x/manual/migration.html) 
guides.
+### 与 Log4j1 的兼容性
 
-## Configuring Log4j1
+Flink 附带了 [Log4j API 
bridge](https://logging.apache.org/log4j/log4j-2.2/log4j-1.2-api/index.html),使得对
 Log4j1 工作的现有应用程序继续工作。
 
-To use Flink with Log4j1 you must ensure that:
-- `org.apache.logging.log4j:log4j-core`, 
`org.apache.logging.log4j:log4j-slf4j-impl` and 
`org.apache.logging.log4j:log4j-1.2-api` are not on the classpath,
-- `log4j:log4j`, `org.slf4j:slf4j-log4j12`, 
`org.apache.logging.log4j:log4j-to-slf4j` and 
`org.apache.logging.log4j:log4j-api` are on the classpath.
+如果你有基于 Log4j1 的自定义配置文件或代码,请查看官方 Log4j 
[兼容性](https://logging.apache.org/log4j/2.x/manual/compatibility.html)和[迁移](https://logging.apache.org/log4j/2.x/manual/migration.html)指南。
 
-In the IDE this means you have to replace such dependencies defined in your 
pom, and possibly add exclusions on dependencies that transitively depend on 
them.
+<a name="configuring-log4j1"></a>
 
-For Flink distributions this means you have to
-- remove the `log4j-core`, `log4j-slf4j-impl` and `log4j-1.2-api` jars from 
the `lib` directory,
-- add the `log4j`, `slf4j-log4j12` and `log4j-to-slf4j` jars to the `lib` 
directory,
-- replace all log4j properties files in the `conf` directory with 
Log4j1-compliant versions.
+## 配置 Log4j1
 
-## Configuring logback
+要将 Flink 与 Log4j1 一起使用,必须确保:
+- Classpath 中不存在 
`org.apache.logging.log4j:log4j-core`,`org.apache.logging.log4j:log4j-slf4j-impl`
 和 `org.apache.logging.log4j:log4j-1.2-api`,
+- 且 Classpath 中存在 
`log4j:log4j`,`org.slf4j:slf4j-log4j12`,`org.apache.logging.log4j:log4j-to-slf4j`
 和 `org.apache.logging.log4j:log4j-api`。
 
-For users and developers alike it is important to control the logging 
framework.
-The configuration of the logging framework is exclusively done by 
configuration files.
-The configuration file either has to be specified by setting the environment 
property `-Dlogback.configurationFile=<file>` or by putting `logback.xml` in 
the classpath.
-The `conf` directory contains a `logback.xml` file which can be modified and 
is used if Flink is started outside of an IDE and with the provided starting 
scripts.
-The provided `logback.xml` has the following form:
+在 IDE 中,这意味着你必须替换在 pom 文件中定义的依赖项,并尽可能在传递依赖于它们的依赖项上添加排除项。

Review comment:
       Could the sentence `并尽可能在传递依赖于它们的依赖项上添加排除项` be polished a little more? It reads somewhat awkwardly.
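
For context, the sentence under discussion describes Maven dependency exclusions. A minimal sketch of what the English original means, where `some.group:some-artifact` is a hypothetical dependency that transitively pulls in Log4j2:

```xml
<!-- Hypothetical dependency that transitively depends on Log4j2;
     the exclusion keeps log4j-core off the classpath so Log4j1 can be used. -->
<dependency>
  <groupId>some.group</groupId>
  <artifactId>some-artifact</artifactId>
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.logging.log4j</groupId>
      <artifactId>log4j-core</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```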

##########
File path: docs/monitoring/logging.zh.md
##########
@@ -23,47 +23,51 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-The logging in Flink is implemented using the slf4j logging interface. As 
underlying logging framework, log4j2 is used. We also provide logback 
configuration files and pass them to the JVM's as properties. Users willing to 
use logback instead of log4j2 can just exclude log4j2 (or delete it from the 
lib/ folder).
+Flink 中的日志记录是使用 slf4j 日志接口实现的。使用 log4j2 作为底层日志框架。我们也支持了 logback 
日志配置,只要将其配置文件作为参数传递给 JVM 即可。愿意使用 logback 而不是 log4j2 的用户只需排除 log4j2 的依赖(或从 lib/ 
文件夹中删除它)即可。
 
 * This will be replaced by the TOC
 {:toc}
 
-## Configuring Log4j2
+<a name="configuring-log4j2"></a>
 
-Log4j2 is controlled using property files. In Flink's case, the file is 
usually called `log4j.properties`. We pass the filename and location of this 
file using the `-Dlog4j.configurationFile=` parameter to the JVM.
+## 配置 Log4j2
 
-Flink ships with the following default properties files:
+Log4j2 是使用配置文件指定的。在 Flink 的使用中,该文件通常命名为 `log4j.properties`。我们使用 
`-Dlog4j.configurationFile=` 参数将该文件的文件名和位置传递给 JVM。
 
-- `log4j-cli.properties`: Used by the Flink command line client (e.g. `flink 
run`) (not code executed on the cluster)
-- `log4j-session.properties`: Used by the Flink command line client when 
starting a YARN or Kubernetes session (`yarn-session.sh`, 
`kubernetes-session.sh`)
-- `log4j.properties`: JobManager/Taskmanager logs (both standalone and YARN)
+Flink 附带以下默认日志配置文件:
 
-### Compatibility with Log4j1
+- `log4j-cli.properties`:由 Flink 命令行客户端使用(例如 `flink run`)(不包括在集群上执行的代码)
+- `log4j-session.properties`:Flink 命令行客户端在启动 YARN 或 Kubernetes session 
时使用(`yarn-session.sh`,`kubernetes-session.sh`)
+- `log4j.properties`:作为 JobManager/TaskManager 日志配置使用(standalone 和 YARN 
两种模式下皆使用)
 
-Flink ships with the [Log4j API 
bridge](https://logging.apache.org/log4j/log4j-2.2/log4j-1.2-api/index.html), 
allowing existing applications that work against Log4j1 classes to continue 
working.
+<a name="compatibility-with-log4j1"></a>
 
-If you have custom Log4j1 properties files or code that relies on Log4j1, 
please check out the official Log4j 
[compatibility](https://logging.apache.org/log4j/2.x/manual/compatibility.html) 
and [migration](https://logging.apache.org/log4j/2.x/manual/migration.html) 
guides.
+### 与 Log4j1 的兼容性
 
-## Configuring Log4j1
+Flink 附带了 [Log4j API 
bridge](https://logging.apache.org/log4j/log4j-2.2/log4j-1.2-api/index.html),使得对
 Log4j1 工作的现有应用程序继续工作。
 
-To use Flink with Log4j1 you must ensure that:
-- `org.apache.logging.log4j:log4j-core`, 
`org.apache.logging.log4j:log4j-slf4j-impl` and 
`org.apache.logging.log4j:log4j-1.2-api` are not on the classpath,
-- `log4j:log4j`, `org.slf4j:slf4j-log4j12`, 
`org.apache.logging.log4j:log4j-to-slf4j` and 
`org.apache.logging.log4j:log4j-api` are on the classpath.
+如果你有基于 Log4j1 的自定义配置文件或代码,请查看官方 Log4j 
[兼容性](https://logging.apache.org/log4j/2.x/manual/compatibility.html)和[迁移](https://logging.apache.org/log4j/2.x/manual/migration.html)指南。

Review comment:
       Would `Log4j 的` be a little better here?

##########
File path: docs/monitoring/logging.zh.md
##########
@@ -100,15 +106,13 @@ import org.slf4j.Logger
 Logger LOG = LoggerFactory.getLogger(Foobar.class)
 {% endhighlight %}
 
-In order to benefit most from slf4j, it is recommended to use its placeholder 
mechanism.
-Using placeholders allows to avoid unnecessary string constructions in case 
that the logging level is set so high that the message would not be logged.
-The syntax of placeholders is the following:
+为了最大限度地利用 slf4j,建议使用其占位符机制。使用占位符可以避免不必要的字符串构造,以防日志级别设置得太高而不会记录消息。占位符的语法如下:

Review comment:
       Would it be better to translate "benefit" as 受益? (The overall meaning of the sentence may need adjusting.) The text that follows explains why using placeholders is more beneficial.
   The meaning of the next sentence also needs revising. If I understand correctly, it says that using placeholders avoids some unnecessary string construction: for example, if the log level is set to WARN, then with placeholders no strings are constructed for DEBUG and INFO messages.
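
The point made in the comment above can be sketched with a toy logger. This is not the real slf4j API; the class and the `debug` helper below are invented purely for illustration. With the level set above DEBUG, the placeholder-style call never formats its argument, while string concatenation pays the formatting cost regardless:

```java
// Toy logger sketch illustrating why slf4j-style placeholders avoid
// unnecessary string construction. NOT the real slf4j API.
public class PlaceholderDemo {
    static int toStringCalls = 0;

    // Simulates a logged object whose string form is expensive to build.
    static class Expensive {
        @Override
        public String toString() {
            toStringCalls++;
            return "expensive-detail";
        }
    }

    // The configured level is WARN, so DEBUG output is disabled.
    static final boolean DEBUG_ENABLED = false;

    // Placeholder-style call: the argument is only formatted if DEBUG is on.
    static void debug(String template, Object arg) {
        if (DEBUG_ENABLED) {
            System.out.println(template.replace("{}", String.valueOf(arg)));
        }
    }

    public static void main(String[] args) {
        Expensive e = new Expensive();

        // Concatenation builds the message eagerly: toString() runs
        // even though the DEBUG message is never printed.
        debug("state: " + e, null);
        System.out.println("after concatenation: toStringCalls = " + toStringCalls);

        // Placeholder defers formatting: toString() never runs here.
        debug("state: {}", e);
        System.out.println("after placeholder:   toStringCalls = " + toStringCalls);
    }
}
```

Running `main` prints `toStringCalls = 1` both times: the concatenation triggered one `toString()` call, and the placeholder call added none.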

##########
File path: docs/monitoring/logging.zh.md
##########
@@ -23,47 +23,51 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-The logging in Flink is implemented using the slf4j logging interface. As 
underlying logging framework, log4j2 is used. We also provide logback 
configuration files and pass them to the JVM's as properties. Users willing to 
use logback instead of log4j2 can just exclude log4j2 (or delete it from the 
lib/ folder).
+Flink 中的日志记录是使用 slf4j 日志接口实现的。使用 log4j2 作为底层日志框架。我们也支持了 logback 
日志配置,只要将其配置文件作为参数传递给 JVM 即可。愿意使用 logback 而不是 log4j2 的用户只需排除 log4j2 的依赖(或从 lib/ 
文件夹中删除它)即可。
 
 * This will be replaced by the TOC
 {:toc}
 
-## Configuring Log4j2
+<a name="configuring-log4j2"></a>
 
-Log4j2 is controlled using property files. In Flink's case, the file is 
usually called `log4j.properties`. We pass the filename and location of this 
file using the `-Dlog4j.configurationFile=` parameter to the JVM.
+## 配置 Log4j2
 
-Flink ships with the following default properties files:
+Log4j2 是使用配置文件指定的。在 Flink 的使用中,该文件通常命名为 `log4j.properties`。我们使用 
`-Dlog4j.configurationFile=` 参数将该文件的文件名和位置传递给 JVM。
 
-- `log4j-cli.properties`: Used by the Flink command line client (e.g. `flink 
run`) (not code executed on the cluster)
-- `log4j-session.properties`: Used by the Flink command line client when 
starting a YARN or Kubernetes session (`yarn-session.sh`, 
`kubernetes-session.sh`)
-- `log4j.properties`: JobManager/Taskmanager logs (both standalone and YARN)
+Flink 附带以下默认日志配置文件:
 
-### Compatibility with Log4j1
+- `log4j-cli.properties`:由 Flink 命令行客户端使用(例如 `flink run`)(不包括在集群上执行的代码)
+- `log4j-session.properties`:Flink 命令行客户端在启动 YARN 或 Kubernetes session 
时使用(`yarn-session.sh`,`kubernetes-session.sh`)
+- `log4j.properties`:作为 JobManager/TaskManager 日志配置使用(standalone 和 YARN 
两种模式下皆使用)
 
-Flink ships with the [Log4j API 
bridge](https://logging.apache.org/log4j/log4j-2.2/log4j-1.2-api/index.html), 
allowing existing applications that work against Log4j1 classes to continue 
working.
+<a name="compatibility-with-log4j1"></a>
 
-If you have custom Log4j1 properties files or code that relies on Log4j1, 
please check out the official Log4j 
[compatibility](https://logging.apache.org/log4j/2.x/manual/compatibility.html) 
and [migration](https://logging.apache.org/log4j/2.x/manual/migration.html) 
guides.
+### 与 Log4j1 的兼容性
 
-## Configuring Log4j1
+Flink 附带了 [Log4j API 
bridge](https://logging.apache.org/log4j/log4j-2.2/log4j-1.2-api/index.html),使得对
 Log4j1 工作的现有应用程序继续工作。

Review comment:
       The meaning of `使得对 Log4j1 工作的现有应用程序继续工作` comes through, but it reads a bit awkwardly. Could it be improved? For example, something along the lines of `使得现在作业能够继续使用 log4j1 的接口`.

##########
File path: docs/monitoring/logging.zh.md
##########
@@ -81,17 +85,19 @@ The provided `logback.xml` has the following form:
 </configuration>
 {% endhighlight %}
 
-In order to control the logging level of 
`org.apache.flink.runtime.jobgraph.JobGraph`, for example, one would have to 
add the following line to the configuration file.
+例如,为了控制 `org.apache.flink.runtime.jobgraph.JobGraph` 的日志记录级别,必须将以下行添加到配置文件中。
 
 {% highlight xml %}
 <logger name="org.apache.flink.runtime.jobgraph.JobGraph" level="DEBUG"/>
 {% endhighlight %}
 
-For further information on configuring logback see [LOGback's 
manual](http://logback.qos.ch/manual/configuration.html).
+有关配置日志的更多信息,请参见 [LOGback 手册](http://logback.qos.ch/manual/configuration.html)。
+
+<a name="best-practices-for-developers"></a>
 
-## Best practices for developers
+## 开发人员的最佳实践
 
-The loggers using slf4j are created by calling
+Slf4j 的 loggers 通过调用来创建

Review comment:
       This sentence reads a little oddly. It has no connection to the code below it, whereas in the English original this sentence leads directly into that code.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

