[hotfix] [doc] Fix several broken "Linking with Flink" links

This closes #3137.
Project: http://git-wip-us.apache.org/repos/asf/flink/repo
Commit: http://git-wip-us.apache.org/repos/asf/flink/commit/6a2970a3
Tree: http://git-wip-us.apache.org/repos/asf/flink/tree/6a2970a3
Diff: http://git-wip-us.apache.org/repos/asf/flink/diff/6a2970a3

Branch: refs/heads/master
Commit: 6a2970a3bb538d17ef9cf103b9e3f0e65a9224c1
Parents: 525edf1
Author: Tzu-Li (Gordon) Tai <tzuli...@apache.org>
Authored: Tue Jan 17 12:21:29 2017 +0100
Committer: zentol <ches...@apache.org>
Committed: Thu Jan 19 23:57:23 2017 +0100

----------------------------------------------------------------------
 docs/dev/batch/index.md                | 2 +-
 docs/dev/connectors/cassandra.md       | 2 +-
 docs/dev/connectors/elasticsearch.md   | 2 +-
 docs/dev/connectors/elasticsearch2.md  | 4 ++--
 docs/dev/connectors/filesystem_sink.md | 2 +-
 docs/dev/connectors/kafka.md           | 2 +-
 docs/dev/connectors/kinesis.md         | 2 +-
 docs/dev/connectors/nifi.md            | 2 +-
 docs/dev/connectors/rabbitmq.md        | 2 +-
 docs/dev/connectors/twitter.md         | 2 +-
 docs/dev/libs/cep.md                   | 4 ++--
 docs/dev/libs/gelly/index.md           | 2 +-
 docs/dev/libs/ml/index.md              | 4 ++--
 docs/dev/libs/ml/quickstart.md         | 2 +-
 14 files changed, 17 insertions(+), 17 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/flink/blob/6a2970a3/docs/dev/batch/index.md
----------------------------------------------------------------------
diff --git a/docs/dev/batch/index.md b/docs/dev/batch/index.md
index 48d60e1..52807ca 100644
--- a/docs/dev/batch/index.md
+++ b/docs/dev/batch/index.md
@@ -49,7 +49,7 @@ Example Program
 The following program is a complete, working example of WordCount. You can copy & paste the code
 to run it locally. You only have to include the correct Flink's library into your project
-(see Section [Linking with Flink]({{ site.baseurl }}/dev/linking_with_flink)) and specify the imports. Then you are ready
+(see Section [Linking with Flink]({{ site.baseurl }}/dev/linking_with_flink.html)) and specify the imports. Then you are ready
 to go!

 <div class="codetabs" markdown="1">

http://git-wip-us.apache.org/repos/asf/flink/blob/6a2970a3/docs/dev/connectors/cassandra.md
----------------------------------------------------------------------
diff --git a/docs/dev/connectors/cassandra.md b/docs/dev/connectors/cassandra.md
index 19d483b..7f76b72 100644
--- a/docs/dev/connectors/cassandra.md
+++ b/docs/dev/connectors/cassandra.md
@@ -35,7 +35,7 @@ To use this connector, add the following dependency to your project:
 </dependency>
 {% endhighlight %}

-Note that the streaming connectors are currently not part of the binary distribution. See how to link with them for cluster execution [here]({{ site.baseurl}}/dev/linking).
+Note that the streaming connectors are currently not part of the binary distribution. See how to link with them for cluster execution [here]({{ site.baseurl}}/dev/linking.html).

 #### Installing Apache Cassandra
 Follow the instructions from the [Cassandra Getting Started page](http://wiki.apache.org/cassandra/GettingStarted).

http://git-wip-us.apache.org/repos/asf/flink/blob/6a2970a3/docs/dev/connectors/elasticsearch.md
----------------------------------------------------------------------
diff --git a/docs/dev/connectors/elasticsearch.md b/docs/dev/connectors/elasticsearch.md
index 1907740..3e8c68a 100644
--- a/docs/dev/connectors/elasticsearch.md
+++ b/docs/dev/connectors/elasticsearch.md
@@ -37,7 +37,7 @@ following dependency to your project:
 Note that the streaming connectors are currently not part of the binary
 distribution. See
-[here]({{site.baseurl}}/dev/linking)
+[here]({{site.baseurl}}/dev/linking.html)
 for information about how to package the program with the libraries for
 cluster execution.
http://git-wip-us.apache.org/repos/asf/flink/blob/6a2970a3/docs/dev/connectors/elasticsearch2.md
----------------------------------------------------------------------
diff --git a/docs/dev/connectors/elasticsearch2.md b/docs/dev/connectors/elasticsearch2.md
index a796280..af02c84 100644
--- a/docs/dev/connectors/elasticsearch2.md
+++ b/docs/dev/connectors/elasticsearch2.md
@@ -37,7 +37,7 @@ following dependency to your project:
 Note that the streaming connectors are currently not part of the binary
 distribution. See
-[here]({{site.baseurl}}/dev/linking)
+[here]({{site.baseurl}}/dev/linking.html)
 for information about how to package the program with the libraries for
 cluster execution.
@@ -145,7 +145,7 @@ More information about Elasticsearch can be found [here](https://elastic.co).

 For the execution of your Flink program,
 it is recommended to build a so-called uber-jar (executable jar) containing all your dependencies
-(see [here]({{site.baseurl}}/dev/linking) for further information).
+(see [here]({{site.baseurl}}/dev/linking.html) for further information).

 However, when an uber-jar containing an Elasticsearch sink is executed,

http://git-wip-us.apache.org/repos/asf/flink/blob/6a2970a3/docs/dev/connectors/filesystem_sink.md
----------------------------------------------------------------------
diff --git a/docs/dev/connectors/filesystem_sink.md b/docs/dev/connectors/filesystem_sink.md
index 030e9d9..0fa8bb1 100644
--- a/docs/dev/connectors/filesystem_sink.md
+++ b/docs/dev/connectors/filesystem_sink.md
@@ -37,7 +37,7 @@ following dependency to your project:
 Note that the streaming connectors are currently not part of the binary
 distribution. See
-[here]({{site.baseurl}}/dev/linking)
+[here]({{site.baseurl}}/dev/linking.html)
 for information about how to package the program with the libraries for
 cluster execution.
http://git-wip-us.apache.org/repos/asf/flink/blob/6a2970a3/docs/dev/connectors/kafka.md
----------------------------------------------------------------------
diff --git a/docs/dev/connectors/kafka.md b/docs/dev/connectors/kafka.md
index e09befe..cc51071 100644
--- a/docs/dev/connectors/kafka.md
+++ b/docs/dev/connectors/kafka.md
@@ -82,7 +82,7 @@ Then, import the connector in your maven project:
 </dependency>
 {% endhighlight %}

-Note that the streaming connectors are currently not part of the binary distribution. See how to link with them for cluster execution [here]({{ site.baseurl}}/dev/linking).
+Note that the streaming connectors are currently not part of the binary distribution. See how to link with them for cluster execution [here]({{ site.baseurl}}/dev/linking.html).

 ### Installing Apache Kafka

http://git-wip-us.apache.org/repos/asf/flink/blob/6a2970a3/docs/dev/connectors/kinesis.md
----------------------------------------------------------------------
diff --git a/docs/dev/connectors/kinesis.md b/docs/dev/connectors/kinesis.md
index 480a97d..d95fe21 100644
--- a/docs/dev/connectors/kinesis.md
+++ b/docs/dev/connectors/kinesis.md
@@ -51,7 +51,7 @@ mvn clean install -Pinclude-kinesis -DskipTests
 The streaming connectors are not part of the binary distribution. See how to link with them for cluster
-execution [here]({{site.baseurl}}/dev/linking).
+execution [here]({{site.baseurl}}/dev/linking.html).
 ### Using the Amazon Kinesis Streams Service
 Follow the instructions from the [Amazon Kinesis Streams Developer Guide](https://docs.aws.amazon.com/streams/latest/dev/learning-kinesis-module-one-create-stream.html)

http://git-wip-us.apache.org/repos/asf/flink/blob/6a2970a3/docs/dev/connectors/nifi.md
----------------------------------------------------------------------
diff --git a/docs/dev/connectors/nifi.md b/docs/dev/connectors/nifi.md
index bdbd808..aa9eba2 100644
--- a/docs/dev/connectors/nifi.md
+++ b/docs/dev/connectors/nifi.md
@@ -37,7 +37,7 @@ following dependency to your project:
 Note that the streaming connectors are currently not part of the binary
 distribution. See
-[here]({{site.baseurl}}/dev/linking)
+[here]({{site.baseurl}}/dev/linking.html)
 for information about how to package the program with the libraries for
 cluster execution.

http://git-wip-us.apache.org/repos/asf/flink/blob/6a2970a3/docs/dev/connectors/rabbitmq.md
----------------------------------------------------------------------
diff --git a/docs/dev/connectors/rabbitmq.md b/docs/dev/connectors/rabbitmq.md
index 7f117c6..47b5998 100644
--- a/docs/dev/connectors/rabbitmq.md
+++ b/docs/dev/connectors/rabbitmq.md
@@ -33,7 +33,7 @@ This connector provides access to data streams from [RabbitMQ](http://www.rabbit
 </dependency>
 {% endhighlight %}

-Note that the streaming connectors are currently not part of the binary distribution. See linking with them for cluster execution [here]({{site.baseurl}}/dev/linking).
+Note that the streaming connectors are currently not part of the binary distribution. See linking with them for cluster execution [here]({{site.baseurl}}/dev/linking.html).

 #### Installing RabbitMQ
 Follow the instructions from the [RabbitMQ download page](http://www.rabbitmq.com/download.html). After the installation the server automatically starts, and the application connecting to RabbitMQ can be launched.
http://git-wip-us.apache.org/repos/asf/flink/blob/6a2970a3/docs/dev/connectors/twitter.md
----------------------------------------------------------------------
diff --git a/docs/dev/connectors/twitter.md b/docs/dev/connectors/twitter.md
index 9b6a019..be15aaf 100644
--- a/docs/dev/connectors/twitter.md
+++ b/docs/dev/connectors/twitter.md
@@ -36,7 +36,7 @@ To use this connector, add the following dependency to your project:
 {% endhighlight %}

 Note that the streaming connectors are currently not part of the binary distribution.
-See linking with them for cluster execution [here]({{site.baseurl}}/dev/linking).
+See linking with them for cluster execution [here]({{site.baseurl}}/dev/linking.html).

 #### Authentication
 In order to connect to the Twitter stream the user has to register their program and acquire the necessary information for the authentication. The process is described below.

http://git-wip-us.apache.org/repos/asf/flink/blob/6a2970a3/docs/dev/libs/cep.md
----------------------------------------------------------------------
diff --git a/docs/dev/libs/cep.md b/docs/dev/libs/cep.md
index c30d37b..8047481 100644
--- a/docs/dev/libs/cep.md
+++ b/docs/dev/libs/cep.md
@@ -37,7 +37,7 @@ because these are used for comparing and matching events.

 ## Getting Started

-If you want to jump right in, you have to [set up a Flink program]({{ site.baseurl }}/dev/linking_with_flink).
+If you want to jump right in, you have to [set up a Flink program]({{ site.baseurl }}/dev/linking_with_flink.html).
 Next, you have to add the FlinkCEP dependency to the `pom.xml` of your project.

 <div class="codetabs" markdown="1">
@@ -63,7 +63,7 @@ Next, you have to add the FlinkCEP dependency to the `pom.xml` of your project.
 </div>

 Note that FlinkCEP is currently not part of the binary distribution.
-See linking with it for cluster execution [here]({{site.baseurl}}/dev/linking).
+See linking with it for cluster execution [here]({{site.baseurl}}/dev/linking.html).
 Now you can start writing your first CEP program using the pattern API.

http://git-wip-us.apache.org/repos/asf/flink/blob/6a2970a3/docs/dev/libs/gelly/index.md
----------------------------------------------------------------------
diff --git a/docs/dev/libs/gelly/index.md b/docs/dev/libs/gelly/index.md
index 6bcdc82..d82fec0 100644
--- a/docs/dev/libs/gelly/index.md
+++ b/docs/dev/libs/gelly/index.md
@@ -63,7 +63,7 @@ Add the following dependency to your `pom.xml` to use Gelly.
 </div>
 </div>

-Note that Gelly is currently not part of the binary distribution. See linking with it for cluster execution [here]({{ site.baseurl }}/dev/linking).
+Note that Gelly is currently not part of the binary distribution. See linking with it for cluster execution [here]({{ site.baseurl }}/dev/linking.html).

 The remaining sections provide a description of available methods and present several examples of how to use Gelly and how to mix it with the Flink DataSet API.

http://git-wip-us.apache.org/repos/asf/flink/blob/6a2970a3/docs/dev/libs/ml/index.md
----------------------------------------------------------------------
diff --git a/docs/dev/libs/ml/index.md b/docs/dev/libs/ml/index.md
index 129be32..fbe3dce 100644
--- a/docs/dev/libs/ml/index.md
+++ b/docs/dev/libs/ml/index.md
@@ -72,7 +72,7 @@ FlinkML currently supports the following algorithms:
 You can check out our [quickstart guide](quickstart.html) for a comprehensive getting started example.

-If you want to jump right in, you have to [set up a Flink program]({{ site.baseurl }}/dev/linking_with_flink).
+If you want to jump right in, you have to [set up a Flink program]({{ site.baseurl }}/dev/linking_with_flink.html).
 Next, you have to add the FlinkML dependency to the `pom.xml` of your project.

 {% highlight xml %}
@@ -84,7 +84,7 @@ Next, you have to add the FlinkML dependency to the `pom.xml` of your project.
 {% endhighlight %}

 Note that FlinkML is currently not part of the binary distribution.
-See linking with it for cluster execution [here]({{site.baseurl}}/dev/linking).
+See linking with it for cluster execution [here]({{site.baseurl}}/dev/linking.html).

 Now you can start solving your analysis task.
 The following code snippet shows how easy it is to train a multiple linear regression model.

http://git-wip-us.apache.org/repos/asf/flink/blob/6a2970a3/docs/dev/libs/ml/quickstart.md
----------------------------------------------------------------------
diff --git a/docs/dev/libs/ml/quickstart.md b/docs/dev/libs/ml/quickstart.md
index 29f2fec..5dff6bb 100644
--- a/docs/dev/libs/ml/quickstart.md
+++ b/docs/dev/libs/ml/quickstart.md
@@ -55,7 +55,7 @@ through [principal components analysis](https://en.wikipedia.org/wiki/Principal_
 ## Linking with FlinkML

 In order to use FlinkML in your project, first you have to
-[set up a Flink program]({{ site.baseurl }}/dev/linking_with_flink).
+[set up a Flink program]({{ site.baseurl }}/dev/linking_with_flink.html).
 Next, you have to add the FlinkML dependency to the `pom.xml` of your project:

 {% highlight xml %}
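Every hunk in this commit applies the same mechanical change: appending `.html` to an extensionless `{{ site.baseurl }}` link. As a hedged sketch (this helper is hypothetical and not part of the commit; the actual fix was done by hand across the 14 files), such a fix could be applied or double-checked with a small script like:

```python
import re

# Hypothetical helper, not part of this commit: append ".html" to
# extensionless {{ site.baseurl }} markdown links, mirroring the edits above.
LINK_RE = re.compile(r"(\]\(\{\{\s*site\.baseurl\s*\}\}/[^)\s#]+?)(\))")

def fix_links(markdown: str) -> str:
    """Append .html to baseurl-relative links whose last path segment has no extension."""
    def repl(match: re.Match) -> str:
        target, close = match.groups()
        # Links that already end in an extension (e.g. .html) are left alone.
        if "." in target.rsplit("/", 1)[-1]:
            return match.group(0)
        return target + ".html" + close
    return LINK_RE.sub(repl, markdown)

before = "See [here]({{site.baseurl}}/dev/linking) for details."
print(fix_links(before))
# -> See [here]({{site.baseurl}}/dev/linking.html) for details.
```

Running it over `docs/` and diffing against the committed result would confirm no extensionless baseurl link was missed.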