This is an automated email from the ASF dual-hosted git repository.

twalthr pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
     new 383cf88  [FLINK-10532] [docs] Fix broken links in documentation
383cf88 is described below

commit 383cf887f9936d53d37ba907e03522fb8c88a67d
Author: Timo Walther <twal...@apache.org>
AuthorDate: Fri Oct 12 21:56:01 2018 +0200

    [FLINK-10532] [docs] Fix broken links in documentation
---
 docs/dev/stream/python.md                   | 2 +-
 docs/dev/table/connect.md                   | 2 +-
 docs/dev/table/streaming/temporal_tables.md | 2 +-
 docs/dev/table/tableApi.md                  | 2 +-
 4 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/dev/stream/python.md b/docs/dev/stream/python.md
index fd30064..c8d59e3 100644
--- a/docs/dev/stream/python.md
+++ b/docs/dev/stream/python.md
@@ -236,7 +236,7 @@ Data transformations transform one or more DataStreams into a new DataStream. Pr
 multiple transformations into sophisticated assemblies.
 
 This section gives a brief overview of the available transformations. The [transformations
-documentation](operators.html) has a full description of all transformations with
+documentation](./operators/index.html) has a full description of all transformations with
 examples.
 
 <br />
diff --git a/docs/dev/table/connect.md b/docs/dev/table/connect.md
index 983c673..0d76af9 100644
--- a/docs/dev/table/connect.md
+++ b/docs/dev/table/connect.md
@@ -597,7 +597,7 @@ Make sure to add the version-specific Kafka dependency. In addition, a correspon
 
 The Elasticsearch connector allows for writing into an index of the Elasticsearch search engine.
 
-The connector can operate in [upsert mode](#update-modes) for exchanging UPSERT/DELETE messages with the external system using a [key defined by the query](streaming.html#table-to-stream-conversion).
+The connector can operate in [upsert mode](#update-modes) for exchanging UPSERT/DELETE messages with the external system using a [key defined by the query](./streaming/dynamic_tables.html#table-to-stream-conversion).
 
 For append-only queries, the connector can also operate in [append mode](#update-modes) for exchanging only INSERT messages with the external system. If no key is defined by the query, a key is automatically generated by Elasticsearch.
 
diff --git a/docs/dev/table/streaming/temporal_tables.md b/docs/dev/table/streaming/temporal_tables.md
index 00e2648..2dd6ed7 100644
--- a/docs/dev/table/streaming/temporal_tables.md
+++ b/docs/dev/table/streaming/temporal_tables.md
@@ -186,4 +186,4 @@ Line `(1)` creates a `rates` [temporal table function](#temporal-table-functions
 which allows us to use the function `rates` in the [Table API](../tableApi.html#joins).
 
 Line `(2)` registers this function under the name `Rates` in our table environment,
-which allows us to use the `Rates` function in [SQL](sql.html#joins).
+which allows us to use the `Rates` function in [SQL](../sql.html#joins).
diff --git a/docs/dev/table/tableApi.md b/docs/dev/table/tableApi.md
index 44dd84f..6c4e1be 100644
--- a/docs/dev/table/tableApi.md
+++ b/docs/dev/table/tableApi.md
@@ -26,7 +26,7 @@ The Table API is a unified, relational API for stream and batch processing. Tabl
 
 The Table API shares many concepts and parts of its API with Flink's SQL integration. Have a look at the [Common Concepts & API]({{ site.baseurl }}/dev/table/common.html) to learn how to register tables or to create a `Table` object. The [Streaming Concepts](./streaming) pages discuss streaming specific concepts such as dynamic tables and time attributes.
 
-The following examples assume a registered table called `Orders` with attributes `(a, b, c, rowtime)`. The `rowtime` field is either a logical [time attribute](streaming/time_attributes.html) in streaming or a regular timestamp field in batch.
+The following examples assume a registered table called `Orders` with attributes `(a, b, c, rowtime)`. The `rowtime` field is either a logical [time attribute](./streaming/time_attributes.html) in streaming or a regular timestamp field in batch.
 
 * This will be replaced by the TOC
 {:toc}
