This is an automated email from the ASF dual-hosted git repository.

liyu pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/flink-web.git

commit 355c2ab49228658ad80cf5f1bc3e97cbac7cc479
Author: Yu Li <l...@apache.org>
AuthorDate: Thu Jun 25 14:15:43 2020 +0800

    Rebuild website
---
 content/blog/feed.xml                              | 350 +++++++-------------
 content/blog/index.html                            |  39 ++-
 content/blog/page10/index.html                     |  40 ++-
 content/blog/page11/index.html                     |  40 ++-
 content/blog/page12/index.html                     |  25 ++
 content/blog/page2/index.html                      |  36 +-
 content/blog/page3/index.html                      |  36 +-
 content/blog/page4/index.html                      |  36 +-
 content/blog/page5/index.html                      |  38 ++-
 content/blog/page6/index.html                      |  38 ++-
 content/blog/page7/index.html                      |  38 ++-
 content/blog/page8/index.html                      |  40 ++-
 content/blog/page9/index.html                      |  40 ++-
 .../2020/06/23/flink-on-zeppelin-part2.html        | 365 +++++++++++++++++++++
 .../flink_append_mode.gif                          | Bin 0 -> 294307 bytes
 .../flink_python_udf.png                           | Bin 0 -> 83093 bytes
 .../flink_scala_udf.png                            | Bin 0 -> 84516 bytes
 .../flink_single_mode.gif                          | Bin 0 -> 58198 bytes
 .../flink_update_mode.gif                          | Bin 0 -> 131055 bytes
 content/index.html                                 |   9 +-
 content/zh/index.html                              |   9 +-
 21 files changed, 779 insertions(+), 400 deletions(-)

diff --git a/content/blog/feed.xml b/content/blog/feed.xml
index eabf57c..e191286 100644
--- a/content/blog/feed.xml
+++ b/content/blog/feed.xml
@@ -7,6 +7,117 @@
 <atom:link href="https://flink.apache.org/blog/feed.xml" rel="self" 
type="application/rss+xml" />
 
 <item>
+<title>Flink on Zeppelin Notebooks for Interactive Data Analysis - Part 
2</title>
+<description>&lt;p&gt;In a previous post, we introduced the basics of Flink on 
Zeppelin and how to do Streaming ETL. In this second part of the “Flink on 
Zeppelin” series of posts, I will share how to 
+perform streaming data visualization via Flink on Zeppelin and how to use 
Apache Flink UDFs in Zeppelin.&lt;/p&gt;
+
+&lt;h1 id=&quot;streaming-data-visualization&quot;&gt;Streaming Data 
Visualization&lt;/h1&gt;
+
+&lt;p&gt;With &lt;a 
href=&quot;https://zeppelin.apache.org/&quot;&gt;Zeppelin&lt;/a&gt;, you can 
build a real-time streaming dashboard without writing a single line of 
JavaScript/HTML/CSS code.&lt;/p&gt;
+
+&lt;p&gt;Overall, Zeppelin supports 3 kinds of streaming data 
analytics:&lt;/p&gt;
+
+&lt;ul&gt;
+  &lt;li&gt;Single Mode&lt;/li&gt;
+  &lt;li&gt;Update Mode&lt;/li&gt;
+  &lt;li&gt;Append Mode&lt;/li&gt;
+&lt;/ul&gt;
+
+&lt;h3 id=&quot;single-mode&quot;&gt;Single Mode&lt;/h3&gt;
+&lt;p&gt;Single mode is used for cases where the result of a SQL statement is 
always a single row, as in the following example. 
+The output is rendered as HTML, and you can specify a paragraph-local property 
&lt;code&gt;template&lt;/code&gt; to define the layout of the final output, 
+using &lt;code&gt;{i}&lt;/code&gt; as a placeholder for the i-th 
column of the result.&lt;/p&gt;
+
+&lt;center&gt;
+&lt;img 
src=&quot;/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_single_mode.gif&quot;
 width=&quot;80%&quot; alt=&quot;Single Mode&quot; /&gt;
+&lt;/center&gt;
+
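+&lt;p&gt;For illustration, a single mode paragraph might look like the sketch 
below; the &lt;code&gt;log_data&lt;/code&gt; table and its 
&lt;code&gt;event_ts&lt;/code&gt; column are hypothetical placeholders:&lt;/p&gt;
+
+&lt;pre&gt;&lt;code&gt;%flink.ssql(type=single, template=Total count is {1})
+
+select max(event_ts), count(1) from log_data
+&lt;/code&gt;&lt;/pre&gt;
+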
+&lt;h3 id=&quot;update-mode&quot;&gt;Update Mode&lt;/h3&gt;
+&lt;p&gt;Update mode is suitable for cases where the output is more 
than one row 
+and is continuously updated. Here’s an example that uses 
&lt;code&gt;GROUP BY&lt;/code&gt;.&lt;/p&gt;
+
+&lt;center&gt;
+&lt;img 
src=&quot;/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_update_mode.gif&quot;
 width=&quot;80%&quot; alt=&quot;Update Mode&quot; /&gt;
+&lt;/center&gt;
+
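+&lt;p&gt;As a minimal sketch, an update mode paragraph over a hypothetical 
&lt;code&gt;log_data&lt;/code&gt; table might look like this:&lt;/p&gt;
+
+&lt;pre&gt;&lt;code&gt;%flink.ssql(type=update)
+
+select url, count(1) as pv from log_data group by url
+&lt;/code&gt;&lt;/pre&gt;
+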
+&lt;h3 id=&quot;append-mode&quot;&gt;Append Mode&lt;/h3&gt;
+&lt;p&gt;Append mode is suitable for cases where output rows are only ever 
appended to the result. 
+For instance, the example below uses a tumbling window.&lt;/p&gt;
+
+&lt;center&gt;
+&lt;img 
src=&quot;/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_append_mode.gif&quot;
 width=&quot;80%&quot; alt=&quot;Append Mode&quot; /&gt;
+&lt;/center&gt;
+
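+&lt;p&gt;A sketch of an append mode paragraph using a tumbling window; the 
&lt;code&gt;log_data&lt;/code&gt; table and its event-time attribute 
&lt;code&gt;event_ts&lt;/code&gt; are placeholders here:&lt;/p&gt;
+
+&lt;pre&gt;&lt;code&gt;%flink.ssql(type=append)
+
+select TUMBLE_START(event_ts, INTERVAL '5' SECOND) as start_time,
+       url, count(1) as pv
+from log_data
+group by TUMBLE(event_ts, INTERVAL '5' SECOND), url
+&lt;/code&gt;&lt;/pre&gt;
+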
+&lt;h1 id=&quot;udf&quot;&gt;UDF&lt;/h1&gt;
+
+&lt;p&gt;SQL is a very powerful language, especially for expressing data flow. 
But quite often you need to handle complicated business logic that cannot 
be expressed in SQL.
+In these cases UDFs (user-defined functions) come in handy. In 
Zeppelin, you can write Scala or Python UDFs, and you can also import existing 
Scala, Python and Java UDFs.
+Here are two examples, a Scala UDF and a Python UDF:&lt;/p&gt;
+
+&lt;ul&gt;
+  &lt;li&gt;Scala UDF&lt;/li&gt;
+&lt;/ul&gt;
+
+&lt;div class=&quot;highlight&quot;&gt;&lt;pre&gt;&lt;code 
class=&quot;language-scala&quot;&gt;&lt;span 
class=&quot;o&quot;&gt;%&lt;/span&gt;&lt;span 
class=&quot;n&quot;&gt;flink&lt;/span&gt;
+
+&lt;span class=&quot;k&quot;&gt;class&lt;/span&gt; &lt;span 
class=&quot;nc&quot;&gt;ScalaUpper&lt;/span&gt; &lt;span 
class=&quot;k&quot;&gt;extends&lt;/span&gt; &lt;span 
class=&quot;nc&quot;&gt;ScalarFunction&lt;/span&gt; &lt;span 
class=&quot;o&quot;&gt;{&lt;/span&gt;
+&lt;span class=&quot;k&quot;&gt;def&lt;/span&gt; &lt;span 
class=&quot;n&quot;&gt;eval&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;(&lt;/span&gt;&lt;span 
class=&quot;n&quot;&gt;str&lt;/span&gt;&lt;span 
class=&quot;k&quot;&gt;:&lt;/span&gt; &lt;span 
class=&quot;kt&quot;&gt;String&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;)&lt;/span&gt; &lt;span 
class=&quot;k&quot;&gt;=&lt;/span&gt; &lt;span 
class=&quot;n&quot;&gt;str&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;.&lt;/span&gt;&lt;span clas [...]
+&lt;span class=&quot;o&quot;&gt;}&lt;/span&gt;
+&lt;span class=&quot;n&quot;&gt;btenv&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;.&lt;/span&gt;&lt;span 
class=&quot;n&quot;&gt;registerFunction&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;(&lt;/span&gt;&lt;span 
class=&quot;s&quot;&gt;&amp;quot;scala_upper&amp;quot;&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;,&lt;/span&gt; &lt;span 
class=&quot;k&quot;&gt;new&lt;/span&gt; &lt;span 
class=&quot;nc&quot;&gt;ScalaUpper&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;())&lt;/span&gt;&lt;/code&gt; [...]
+
+&lt;ul&gt;
+  &lt;li&gt;Python UDF&lt;/li&gt;
+&lt;/ul&gt;
+
+&lt;div class=&quot;highlight&quot;&gt;&lt;pre&gt;&lt;code 
class=&quot;language-python&quot;&gt;&lt;span 
class=&quot;o&quot;&gt;%&lt;/span&gt;&lt;span 
class=&quot;n&quot;&gt;flink&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;.&lt;/span&gt;&lt;span 
class=&quot;n&quot;&gt;pyflink&lt;/span&gt;
+
+&lt;span class=&quot;k&quot;&gt;class&lt;/span&gt; &lt;span 
class=&quot;nc&quot;&gt;PythonUpper&lt;/span&gt;&lt;span 
class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span 
class=&quot;n&quot;&gt;ScalarFunction&lt;/span&gt;&lt;span 
class=&quot;p&quot;&gt;):&lt;/span&gt;
+&lt;span class=&quot;k&quot;&gt;def&lt;/span&gt; &lt;span 
class=&quot;nf&quot;&gt;eval&lt;/span&gt;&lt;span 
class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span 
class=&quot;bp&quot;&gt;self&lt;/span&gt;&lt;span 
class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span 
class=&quot;n&quot;&gt;s&lt;/span&gt;&lt;span 
class=&quot;p&quot;&gt;):&lt;/span&gt;
+ &lt;span class=&quot;k&quot;&gt;return&lt;/span&gt; &lt;span 
class=&quot;n&quot;&gt;s&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;.&lt;/span&gt;&lt;span 
class=&quot;n&quot;&gt;upper&lt;/span&gt;&lt;span 
class=&quot;p&quot;&gt;()&lt;/span&gt;
+
+&lt;span class=&quot;n&quot;&gt;bt_env&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;.&lt;/span&gt;&lt;span 
class=&quot;n&quot;&gt;register_function&lt;/span&gt;&lt;span 
class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span 
class=&quot;s&quot;&gt;&amp;quot;python_upper&amp;quot;&lt;/span&gt;&lt;span 
class=&quot;p&quot;&gt;,&lt;/span&gt; &lt;span 
class=&quot;n&quot;&gt;udf&lt;/span&gt;&lt;span 
class=&quot;p&quot;&gt;(&lt;/span&gt;&lt;span 
class=&quot;n&quot;&gt;PythonUpper&lt;/span&gt;&lt;span clas [...]
+
+&lt;p&gt;After you define the UDFs, you can use them directly in SQL:&lt;/p&gt;
+
+&lt;ul&gt;
+  &lt;li&gt;Use Scala UDF in SQL&lt;/li&gt;
+&lt;/ul&gt;
+
+&lt;center&gt;
+&lt;img 
src=&quot;/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_scala_udf.png&quot;
 width=&quot;100%&quot; alt=&quot;Scala UDF&quot; /&gt;
+&lt;/center&gt;
+
+&lt;ul&gt;
+  &lt;li&gt;Use Python UDF in SQL&lt;/li&gt;
+&lt;/ul&gt;
+
+&lt;center&gt;
+&lt;img 
src=&quot;/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_python_udf.png&quot;
 width=&quot;100%&quot; alt=&quot;Python UDF&quot; /&gt;
+&lt;/center&gt;
+
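+&lt;p&gt;For example, with the functions registered above, a streaming SQL 
paragraph could call them like this (the &lt;code&gt;log_data&lt;/code&gt; table 
and &lt;code&gt;url&lt;/code&gt; column are placeholders, and 
&lt;code&gt;python_upper&lt;/code&gt; can be used in the same way):&lt;/p&gt;
+
+&lt;pre&gt;&lt;code&gt;%flink.ssql(type=update)
+
+select scala_upper(url) as url, count(1) as pv
+from log_data
+group by scala_upper(url)
+&lt;/code&gt;&lt;/pre&gt;
+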
+&lt;h1 id=&quot;summary&quot;&gt;Summary&lt;/h1&gt;
+
+&lt;p&gt;In this post, we explained how to perform streaming data 
visualization via Flink on Zeppelin and how to use UDFs there. 
+Beyond that, you can do much more with Flink in Zeppelin, such as batch 
processing and Hive integration.
+Check the articles in the references below for more details, and here’s a list of 
&lt;a 
href=&quot;https://www.youtube.com/watch?v=YxPo0Fosjjg&amp;amp;list=PL4oy12nnS7FFtg3KV1iS5vDb0pTz12VcX&quot;&gt;Flink
 on Zeppelin tutorial videos&lt;/a&gt; for your reference.&lt;/p&gt;
+
+&lt;h1 id=&quot;references&quot;&gt;References&lt;/h1&gt;
+
+&lt;ul&gt;
+  &lt;li&gt;&lt;a href=&quot;http://zeppelin.apache.org&quot;&gt;Apache 
Zeppelin official website&lt;/a&gt;&lt;/li&gt;
+  &lt;li&gt;Flink on Zeppelin tutorials - &lt;a 
href=&quot;https://medium.com/@zjffdu/flink-on-zeppelin-part-1-get-started-2591aaa6aa47&quot;&gt;Part
 1&lt;/a&gt;&lt;/li&gt;
+  &lt;li&gt;Flink on Zeppelin tutorials - &lt;a 
href=&quot;https://medium.com/@zjffdu/flink-on-zeppelin-part-2-batch-711731df5ad9&quot;&gt;Part
 2&lt;/a&gt;&lt;/li&gt;
+  &lt;li&gt;Flink on Zeppelin tutorials - &lt;a 
href=&quot;https://medium.com/@zjffdu/flink-on-zeppelin-part-3-streaming-5fca1e16754&quot;&gt;Part
 3&lt;/a&gt;&lt;/li&gt;
+  &lt;li&gt;Flink on Zeppelin tutorials - &lt;a 
href=&quot;https://medium.com/@zjffdu/flink-on-zeppelin-part-4-advanced-usage-998b74908cd9&quot;&gt;Part
 4&lt;/a&gt;&lt;/li&gt;
+  &lt;li&gt;&lt;a 
href=&quot;https://www.youtube.com/watch?v=YxPo0Fosjjg&amp;amp;list=PL4oy12nnS7FFtg3KV1iS5vDb0pTz12VcX&quot;&gt;Flink
 on Zeppelin tutorial videos&lt;/a&gt;&lt;/li&gt;
+&lt;/ul&gt;
+</description>
+<pubDate>Tue, 23 Jun 2020 10:00:00 +0200</pubDate>
+<link>https://flink.apache.org/ecosystem/2020/06/23/flink-on-zeppelin-part2.html</link>
+<guid 
isPermaLink="true">/ecosystem/2020/06/23/flink-on-zeppelin-part2.html</guid>
+</item>
+
+<item>
 <title>Flink on Zeppelin Notebooks for Interactive Data Analysis - Part 
1</title>
 <description>&lt;p&gt;The latest release of &lt;a 
href=&quot;https://zeppelin.apache.org/&quot;&gt;Apache Zeppelin&lt;/a&gt; 
comes with a redesigned interpreter for Apache Flink (only Flink 1.10 and above 
is supported moving forward) 
 that allows developers to use Flink directly on Zeppelin notebooks for 
interactive data analysis. I wrote 2 posts about how to use Flink in Zeppelin. 
This is part-1 where I explain how the Flink interpreter in Zeppelin works, 
@@ -16772,244 +16883,5 @@ tools, graph database systems and sampling 
techniques.&lt;/p&gt;
 <guid isPermaLink="true">/news/2015/08/24/introducing-flink-gelly.html</guid>
 </item>
 
-<item>
-<title>Announcing Apache Flink 0.9.0</title>
-<description>&lt;p&gt;The Apache Flink community is pleased to announce the 
availability of the 0.9.0 release. The release is the result of many months of 
hard work within the Flink community. It contains many new features and 
improvements which were previewed in the 0.9.0-milestone1 release and have been 
polished since then. This is the largest Flink release so far.&lt;/p&gt;
-
-&lt;p&gt;&lt;a 
href=&quot;http://flink.apache.org/downloads.html&quot;&gt;Download the 
release&lt;/a&gt; and check out &lt;a 
href=&quot;http://ci.apache.org/projects/flink/flink-docs-release-0.9/&quot;&gt;the
 documentation&lt;/a&gt;. Feedback through the Flink&lt;a 
href=&quot;http://flink.apache.org/community.html#mailing-lists&quot;&gt; 
mailing lists&lt;/a&gt; is, as always, very welcome!&lt;/p&gt;
-
-&lt;h2 id=&quot;new-features&quot;&gt;New Features&lt;/h2&gt;
-
-&lt;h3 
id=&quot;exactly-once-fault-tolerance-for-streaming-programs&quot;&gt;Exactly-once
 Fault Tolerance for streaming programs&lt;/h3&gt;
-
-&lt;p&gt;This release introduces a new fault tolerance mechanism for streaming 
dataflows. The new checkpointing algorithm takes data sources and also 
user-defined state into account and recovers failures such that all records are 
reflected exactly once in the operator states.&lt;/p&gt;
-
-&lt;p&gt;The checkpointing algorithm is lightweight and driven by barriers 
that are periodically injected into the data streams at the sources. As such, 
it has an extremely low coordination overhead and is able to sustain very high 
throughput rates. User-defined state can be automatically backed up to 
configurable storage by the fault tolerance mechanism.&lt;/p&gt;
-
-&lt;p&gt;Please refer to &lt;a 
href=&quot;http://ci.apache.org/projects/flink/flink-docs-release-0.9/apis/streaming_guide.html#stateful-computation&quot;&gt;the
 documentation on stateful computation&lt;/a&gt; for details in how to use 
fault tolerant data streams with Flink.&lt;/p&gt;
-
-&lt;p&gt;The fault tolerance mechanism requires data sources that can replay 
recent parts of the stream, such as &lt;a 
href=&quot;http://kafka.apache.org&quot;&gt;Apache Kafka&lt;/a&gt;. Read more 
&lt;a 
href=&quot;http://ci.apache.org/projects/flink/flink-docs-release-0.9/apis/streaming_guide.html#apache-kafka&quot;&gt;about
 how to use the persistent Kafka source&lt;/a&gt;.&lt;/p&gt;
-
-&lt;h3 id=&quot;table-api&quot;&gt;Table API&lt;/h3&gt;
-
-&lt;p&gt;Flink’s new Table API offers a higher-level abstraction for 
interacting with structured data sources. The Table API allows users to execute 
logical, SQL-like queries on distributed data sets while allowing them to 
freely mix declarative queries with regular Flink operators. Here is an example 
that groups and joins two tables:&lt;/p&gt;
-
-&lt;div class=&quot;highlight&quot;&gt;&lt;pre&gt;&lt;code 
class=&quot;language-scala&quot;&gt;&lt;span 
class=&quot;k&quot;&gt;val&lt;/span&gt; &lt;span 
class=&quot;n&quot;&gt;clickCounts&lt;/span&gt; &lt;span 
class=&quot;k&quot;&gt;=&lt;/span&gt; &lt;span 
class=&quot;n&quot;&gt;clicks&lt;/span&gt;
-  &lt;span class=&quot;o&quot;&gt;.&lt;/span&gt;&lt;span 
class=&quot;n&quot;&gt;groupBy&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;(&lt;/span&gt;&lt;span 
class=&quot;-Symbol&quot;&gt;&amp;#39;user&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;).&lt;/span&gt;&lt;span 
class=&quot;n&quot;&gt;select&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;(&lt;/span&gt;&lt;span 
class=&quot;-Symbol&quot;&gt;&amp;#39;userId&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;,&lt;/span&gt; &lt;span class=&quot;-Sym [...]
-
-&lt;span class=&quot;k&quot;&gt;val&lt;/span&gt; &lt;span 
class=&quot;n&quot;&gt;activeUsers&lt;/span&gt; &lt;span 
class=&quot;k&quot;&gt;=&lt;/span&gt; &lt;span 
class=&quot;n&quot;&gt;users&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;.&lt;/span&gt;&lt;span 
class=&quot;n&quot;&gt;join&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;(&lt;/span&gt;&lt;span 
class=&quot;n&quot;&gt;clickCounts&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;)&lt;/span&gt;
-  &lt;span class=&quot;o&quot;&gt;.&lt;/span&gt;&lt;span 
class=&quot;n&quot;&gt;where&lt;/span&gt;&lt;span 
class=&quot;o&quot;&gt;(&lt;/span&gt;&lt;span 
class=&quot;-Symbol&quot;&gt;&amp;#39;id&lt;/span&gt; &lt;span 
class=&quot;o&quot;&gt;===&lt;/span&gt; &lt;span 
class=&quot;-Symbol&quot;&gt;&amp;#39;userId&lt;/span&gt; &lt;span 
class=&quot;o&quot;&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span 
class=&quot;-Symbol&quot;&gt;&amp;#39;count&lt;/span&gt; &lt;span 
class=&quot;o&quot;&gt;&amp;gt [...]
-
-&lt;p&gt;Tables consist of logical attributes that can be selected by name 
rather than physical Java and Scala data types. This alleviates a lot of 
boilerplate code for common ETL tasks and raises the abstraction for Flink 
programs. Tables are available for both static and streaming data sources 
(DataSet and DataStream APIs).&lt;/p&gt;
-
-&lt;p&gt;&lt;a 
href=&quot;http://ci.apache.org/projects/flink/flink-docs-release-0.9/libs/table.html&quot;&gt;Check
 out the Table guide for Java and Scala&lt;/a&gt;.&lt;/p&gt;
-
-&lt;h3 id=&quot;gelly-graph-processing-api&quot;&gt;Gelly Graph Processing 
API&lt;/h3&gt;
-
-&lt;p&gt;Gelly is a Java Graph API for Flink. It contains a set of utilities 
for graph analysis, support for iterative graph processing and a library of 
graph algorithms. Gelly exposes a Graph data structure that wraps DataSets for 
vertices and edges, as well as methods for creating graphs from DataSets, graph 
transformations and utilities (e.g., in- and out- degrees of vertices), 
neighborhood aggregations, iterative vertex-centric graph processing, as well 
as a library of common graph a [...]
-
-&lt;p&gt;Gelly internally builds on top of Flink’s&lt;a 
href=&quot;http://ci.apache.org/projects/flink/flink-docs-release-0.9/apis/iterations.html&quot;&gt;
 delta iterations&lt;/a&gt;. Iterative graph algorithms are executed leveraging 
mutable state, achieving similar performance with specialized graph processing 
systems.&lt;/p&gt;
-
-&lt;p&gt;Gelly will eventually subsume Spargel, Flink’s Pregel-like 
API.&lt;/p&gt;
-
-&lt;p&gt;Note: The Gelly library is still in beta status and subject to 
improvements and heavy performance tuning.&lt;/p&gt;
-
-&lt;p&gt;&lt;a 
href=&quot;http://ci.apache.org/projects/flink/flink-docs-release-0.9/libs/gelly_guide.html&quot;&gt;Check
 out the Gelly guide&lt;/a&gt;.&lt;/p&gt;
-
-&lt;h3 id=&quot;flink-machine-learning-library&quot;&gt;Flink Machine Learning 
Library&lt;/h3&gt;
-
-&lt;p&gt;This release includes the first version of Flink’s Machine Learning 
library. The library’s pipeline approach, which has been strongly inspired by 
scikit-learn’s abstraction of transformers and predictors, makes it easy to 
quickly set up a data processing pipeline and to get your job done.&lt;/p&gt;
-
-&lt;p&gt;Flink distinguishes between transformers and predictors. Transformers 
are components which transform your input data into a new format allowing you 
to extract features, cleanse your data or to sample from it. Predictors on the 
other hand constitute the components which take your input data and train a 
model on it. The model you obtain from the learner can then be evaluated and 
used to make predictions on unseen data.&lt;/p&gt;
-
-&lt;p&gt;Currently, the machine learning library contains transformers and 
predictors to do multiple tasks. The library supports multiple linear 
regression using stochastic gradient descent to scale to large data sizes. 
Furthermore, it includes an alternating least squares (ALS) implementation to 
factorizes large matrices. The matrix factorization can be used to do 
collaborative filtering. An implementation of the communication efficient 
distributed dual coordinate ascent (CoCoA) algorit [...]
-
-&lt;p&gt;Note: The ML library is still in beta status and subject to 
improvements and heavy performance tuning.&lt;/p&gt;
-
-&lt;p&gt;&lt;a 
href=&quot;http://ci.apache.org/projects/flink/flink-docs-release-0.9/libs/ml/&quot;&gt;Check
 out FlinkML&lt;/a&gt;&lt;/p&gt;
-
-&lt;h3 id=&quot;flink-on-yarn-leveraging-apache-tez&quot;&gt;Flink on YARN 
leveraging Apache Tez&lt;/h3&gt;
-
-&lt;p&gt;We are introducing a new execution mode for Flink to be able to run 
restricted Flink programs on top of&lt;a 
href=&quot;http://tez.apache.org&quot;&gt; Apache Tez&lt;/a&gt;. This mode 
retains Flink’s APIs, optimizer, as well as Flink’s runtime operators, but 
instead of wrapping those in Flink tasks that are executed by Flink 
TaskManagers, it wraps them in Tez runtime tasks and builds a Tez DAG that 
represents the program.&lt;/p&gt;
-
-&lt;p&gt;By using Flink on Tez, users have an additional choice for an 
execution platform for Flink programs. While Flink’s distributed runtime favors 
low latency, streaming shuffles, and iterative algorithms, Tez focuses on 
scalability and elastic resource usage in shared YARN clusters.&lt;/p&gt;
-
-&lt;p&gt;&lt;a 
href=&quot;http://ci.apache.org/projects/flink/flink-docs-release-0.9/setup/flink_on_tez.html&quot;&gt;Get
 started with Flink on Tez&lt;/a&gt;.&lt;/p&gt;
-
-&lt;h3 id=&quot;reworked-distributed-runtime-on-akka&quot;&gt;Reworked 
Distributed Runtime on Akka&lt;/h3&gt;
-
-&lt;p&gt;Flink’s RPC system has been replaced by the widely adopted&lt;a 
href=&quot;http://akka.io&quot;&gt; Akka&lt;/a&gt; framework. Akka’s 
concurrency model offers the right abstraction to develop a fast as well as 
robust distributed system. By using Akka’s own failure detection mechanism the 
stability of Flink’s runtime is significantly improved, because the system can 
now react in proper form to node outages. Furthermore, Akka improves Flink’s 
scalability by introducing asynchronous [...]
-
-&lt;h3 id=&quot;improved-yarn-support&quot;&gt;Improved YARN support&lt;/h3&gt;
-
-&lt;p&gt;Flink’s YARN client contains several improvements, such as a detached 
mode for starting a YARN session in the background, the ability to submit a 
single Flink job to a YARN cluster without starting a session, including a 
“fire and forget” mode. Flink is now also able to reallocate failed YARN 
containers to maintain the size of the requested cluster. This feature allows 
to implement fault-tolerant setups on top of YARN. There is also an internal 
Java API to deploy and control a r [...]
-
-&lt;p&gt;&lt;a 
href=&quot;http://ci.apache.org/projects/flink/flink-docs-release-0.9/setup/yarn_setup.html&quot;&gt;See
 the YARN docs&lt;/a&gt;.&lt;/p&gt;
-
-&lt;h3 
id=&quot;static-code-analysis-for-the-flink-optimizer-opening-the-udf-blackboxes&quot;&gt;Static
 Code Analysis for the Flink Optimizer: Opening the UDF blackboxes&lt;/h3&gt;
-
-&lt;p&gt;This release introduces a first version of a static code analyzer 
that pre-interprets functions written by the user to get information about the 
function’s internal dataflow. The code analyzer can provide useful information 
about &lt;a 
href=&quot;http://ci.apache.org/projects/flink/flink-docs-release-0.9/apis/programming_guide.html#semantic-annotations&quot;&gt;forwarded
 fields&lt;/a&gt; to Flink’s optimizer and thus speedup job executions. It also 
informs if the code contains o [...]
-
-&lt;p&gt;ExecutionEnvironment.getExecutionConfig().setCodeAnalysisMode(…)&lt;/p&gt;
-
-&lt;p&gt;either as an assistant that gives hints during the implementation or 
by directly applying the optimizations that have been found.&lt;/p&gt;
-
-&lt;h2 id=&quot;more-improvements-and-fixes&quot;&gt;More Improvements and 
Fixes&lt;/h2&gt;
-
-&lt;ul&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1605&quot;&gt;FLINK-1605&lt;/a&gt;:
 Flink is not exposing its Guava and ASM dependencies to Maven projects 
depending on Flink. We use the maven-shade-plugin to relocate these 
dependencies into our own namespace. This allows users to use any Guava or ASM 
version.&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1605&quot;&gt;FLINK-1417&lt;/a&gt;:
 Automatic recognition and registration of Java Types at Kryo and the internal 
serializers: Flink has its own type handling and serialization framework 
falling back to Kryo for types that it cannot handle. To get the best 
performance Flink is automatically registering all types a user is using in 
their program with Kryo.Flink also registers serializers for Protocol Buffers, 
Thrift [...]
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1296&quot;&gt;FLINK-1296&lt;/a&gt;:
 Add support for sorting very large records&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1679&quot;&gt;FLINK-1679&lt;/a&gt;:
 “degreeOfParallelism” methods renamed to “parallelism”&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1501&quot;&gt;FLINK-1501&lt;/a&gt;:
 Add metrics library for monitoring TaskManagers&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1760&quot;&gt;FLINK-1760&lt;/a&gt;:
 Add support for building Flink with Scala 2.11&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1648&quot;&gt;FLINK-1648&lt;/a&gt;:
 Add a mode where the system automatically sets the parallelism to the 
available task slots&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1622&quot;&gt;FLINK-1622&lt;/a&gt;:
 Add groupCombine operator&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1589&quot;&gt;FLINK-1589&lt;/a&gt;:
 Add option to pass Configuration to LocalExecutor&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1504&quot;&gt;FLINK-1504&lt;/a&gt;:
 Add support for accessing secured HDFS clusters in standalone mode&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1478&quot;&gt;FLINK-1478&lt;/a&gt;:
 Add strictly local input split assignment&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1512&quot;&gt;FLINK-1512&lt;/a&gt;:
 Add CsvReader for reading into POJOs.&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1461&quot;&gt;FLINK-1461&lt;/a&gt;:
 Add sortPartition operator&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1450&quot;&gt;FLINK-1450&lt;/a&gt;:
 Add Fold operator to the Streaming api&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1389&quot;&gt;FLINK-1389&lt;/a&gt;:
 Allow setting custom file extensions for files created by the 
FileOutputFormat&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1236&quot;&gt;FLINK-1236&lt;/a&gt;:
 Add support for localization of Hadoop Input Splits&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1179&quot;&gt;FLINK-1179&lt;/a&gt;:
 Add button to JobManager web interface to request stack trace of a 
TaskManager&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1105&quot;&gt;FLINK-1105&lt;/a&gt;:
 Add support for locally sorted output&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1688&quot;&gt;FLINK-1688&lt;/a&gt;:
 Add socket sink&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1436&quot;&gt;FLINK-1436&lt;/a&gt;:
 Improve usability of command line interface&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-2174&quot;&gt;FLINK-2174&lt;/a&gt;:
 Allow comments in ‘slaves’ file&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1698&quot;&gt;FLINK-1698&lt;/a&gt;:
 Add polynomial base feature mapper to ML library&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1697&quot;&gt;FLINK-1697&lt;/a&gt;:
 Add alternating least squares algorithm for matrix factorization to ML 
library&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1792&quot;&gt;FLINK-1792&lt;/a&gt;:
 FLINK-456 Improve TM Monitoring: CPU utilization, hide graphs by default and 
show summary only&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1672&quot;&gt;FLINK-1672&lt;/a&gt;:
 Refactor task registration/unregistration&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-2001&quot;&gt;FLINK-2001&lt;/a&gt;:
 DistanceMetric cannot be serialized&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1676&quot;&gt;FLINK-1676&lt;/a&gt;:
 enableForceKryo() is not working as expected&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1959&quot;&gt;FLINK-1959&lt;/a&gt;:
 Accumulators BROKEN after Partitioning&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1696&quot;&gt;FLINK-1696&lt;/a&gt;:
 Add multiple linear regression to ML library&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1820&quot;&gt;FLINK-1820&lt;/a&gt;:
 Bug in DoubleParser and FloatParser - empty String is not casted to 0&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1985&quot;&gt;FLINK-1985&lt;/a&gt;:
 Streaming does not correctly forward ExecutionConfig to runtime&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1828&quot;&gt;FLINK-1828&lt;/a&gt;:
 Impossible to output data to an HBase table&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1952&quot;&gt;FLINK-1952&lt;/a&gt;:
 Cannot run ConnectedComponents example: Could not allocate a slot on 
instance&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1848&quot;&gt;FLINK-1848&lt;/a&gt;:
 Paths containing a Windows drive letter cannot be used in 
FileOutputFormats&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1954&quot;&gt;FLINK-1954&lt;/a&gt;:
 Task Failures and Error Handling&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-2004&quot;&gt;FLINK-2004&lt;/a&gt;:
 Memory leak in presence of failed checkpoints in KafkaSource&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-2132&quot;&gt;FLINK-2132&lt;/a&gt;:
 Java version parsing is not working for OpenJDK&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-2098&quot;&gt;FLINK-2098&lt;/a&gt;:
 Checkpoint barrier initiation at source is not aligned with 
snapshotting&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-2069&quot;&gt;FLINK-2069&lt;/a&gt;:
 writeAsCSV function in DataStream Scala API creates no file&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-2092&quot;&gt;FLINK-2092&lt;/a&gt;:
 Document (new) behavior of print() and execute()&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-2177&quot;&gt;FLINK-2177&lt;/a&gt;:
 NullPointer in task resource release&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-2054&quot;&gt;FLINK-2054&lt;/a&gt;:
 StreamOperator rework removed copy calls when passing output to a chained 
operator&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-2196&quot;&gt;FLINK-2196&lt;/a&gt;:
 Missplaced Class in flink-java SortPartitionOperator&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-2191&quot;&gt;FLINK-2191&lt;/a&gt;:
 Inconsistent use of Closure Cleaner in Streaming API&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-2206&quot;&gt;FLINK-2206&lt;/a&gt;:
 JobManager webinterface shows 5 finished jobs at most&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-2188&quot;&gt;FLINK-2188&lt;/a&gt;:
 Reading from big HBase Tables&lt;/p&gt;
-  &lt;/li&gt;
-  &lt;li&gt;
-    &lt;p&gt;&lt;a 
href=&quot;https://issues.apache.org/jira/browse/FLINK-1781&quot;&gt;FLINK-1781&lt;/a&gt;:
 Quickstarts broken due to Scala Version Variables&lt;/p&gt;
-  &lt;/li&gt;
-&lt;/ul&gt;
-
-&lt;h2 id=&quot;notice&quot;&gt;Notice&lt;/h2&gt;
-
-&lt;p&gt;The 0.9 series of Flink is the last version to support Java 6. If you 
are still using Java 6, please consider upgrading to Java 8 (Java 7 ended its 
free support in April 2015).&lt;/p&gt;
-
-&lt;p&gt;Flink will require at least Java 7 in major releases after 
0.9.0.&lt;/p&gt;
-</description>
-<pubDate>Wed, 24 Jun 2015 16:00:00 +0200</pubDate>
-<link>https://flink.apache.org/news/2015/06/24/announcing-apache-flink-0.9.0-release.html</link>
-<guid 
isPermaLink="true">/news/2015/06/24/announcing-apache-flink-0.9.0-release.html</guid>
-</item>
-
 </channel>
 </rss>
diff --git a/content/blog/index.html b/content/blog/index.html
index d89dc1c..89c69ec 100644
--- a/content/blog/index.html
+++ b/content/blog/index.html
@@ -196,6 +196,22 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a 
href="/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">Flink on Zeppelin 
Notebooks for Interactive Data Analysis - Part 2</a></h2>
+
+      <p>23 Jun 2020
+       Jeff Zhang (<a href="https://twitter.com/zjffdu">@zjffdu</a>)</p>
+
+      <p><p>In a previous post, we introduced the basics of Flink on Zeppelin 
and how to do Streaming ETL. In this second part of the “Flink on Zeppelin” 
series of posts, I will share how to 
+perform streaming data visualization via Flink on Zeppelin and how to use 
Apache Flink UDFs in Zeppelin.</p>
+
+</p>
+
+      <p><a href="/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">Continue 
reading &raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a 
href="/news/2020/06/15/flink-on-zeppelin-part1.html">Flink on Zeppelin 
Notebooks for Interactive Data Analysis - Part 1</a></h2>
 
       <p>15 Jun 2020
@@ -322,19 +338,6 @@ and provide a tutorial for running Streaming ETL with 
Flink on Zeppelin.</p>
 
     <hr>
     
-    <article>
-      <h2 class="blog-title"><a 
href="/2020/04/09/pyflink-udf-support-flink.html">PyFlink: Introducing Python 
Support for UDFs in Flink's Table API</a></h2>
-
-      <p>09 Apr 2020
-       Jincheng Sun (<a 
href="https://twitter.com/sunjincheng121";>@sunjincheng121</a>) &amp; Markos 
Sfikas (<a href="https://twitter.com/MarkSfik";>@MarkSfik</a>)</p>
-
-      <p>Flink 1.10 extends its support for Python by adding Python UDFs in 
PyFlink. This post explains how UDFs work in PyFlink and gives some practical 
examples of how to use UDFs in PyFlink.</p>
-
-      <p><a href="/2020/04/09/pyflink-udf-support-flink.html">Continue reading 
&raquo;</a></p>
-    </article>
-
-    <hr>
-    
 
     <!-- Pagination links -->
     
@@ -367,6 +370,16 @@ and provide a tutorial for running Streaming ETL with 
Flink on Zeppelin.</p>
 
     <ul id="markdown-toc">
       
+      <li><a href="/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">Flink 
on Zeppelin Notebooks for Interactive Data Analysis - Part 2</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2020/06/15/flink-on-zeppelin-part1.html">Flink on 
Zeppelin Notebooks for Interactive Data Analysis - Part 1</a></li>
 
       
diff --git a/content/blog/page10/index.html b/content/blog/page10/index.html
index 77fe9a8..730edc2 100644
--- a/content/blog/page10/index.html
+++ b/content/blog/page10/index.html
@@ -196,6 +196,21 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a 
href="/news/2016/02/11/release-0.10.2.html">Flink 0.10.2 Released</a></h2>
+
+      <p>11 Feb 2016
+      </p>
+
+      <p><p>Today, the Flink community released Flink version 
<strong>0.10.2</strong>, the second bugfix release of the 0.10 series.</p>
+
+</p>
+
+      <p><a href="/news/2016/02/11/release-0.10.2.html">Continue reading 
&raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a 
href="/news/2015/12/18/a-year-in-review.html">Flink 2015: A year in review, and 
a lookout to 2016</a></h2>
 
       <p>18 Dec 2015 by Robert Metzger (<a 
href="https://twitter.com/";>@rmetzger_</a>)
@@ -331,21 +346,6 @@ vertex-centric or gather-sum-apply to Flink dataflows.</p>
 
     <hr>
     
-    <article>
-      <h2 class="blog-title"><a 
href="/news/2015/06/24/announcing-apache-flink-0.9.0-release.html">Announcing 
Apache Flink 0.9.0</a></h2>
-
-      <p>24 Jun 2015
-      </p>
-
-      <p><p>The Apache Flink community is pleased to announce the availability 
of the 0.9.0 release. The release is the result of many months of hard work 
within the Flink community. It contains many new features and improvements 
which were previewed in the 0.9.0-milestone1 release and have been polished 
since then. This is the largest Flink release so far.</p>
-
-</p>
-
-      <p><a 
href="/news/2015/06/24/announcing-apache-flink-0.9.0-release.html">Continue 
reading &raquo;</a></p>
-    </article>
-
-    <hr>
-    
 
     <!-- Pagination links -->
     
@@ -378,6 +378,16 @@ vertex-centric or gather-sum-apply to Flink dataflows.</p>
 
     <ul id="markdown-toc">
       
+      <li><a href="/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">Flink 
on Zeppelin Notebooks for Interactive Data Analysis - Part 2</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2020/06/15/flink-on-zeppelin-part1.html">Flink on 
Zeppelin Notebooks for Interactive Data Analysis - Part 1</a></li>
 
       
diff --git a/content/blog/page11/index.html b/content/blog/page11/index.html
index 2ee6290..9f9951f 100644
--- a/content/blog/page11/index.html
+++ b/content/blog/page11/index.html
@@ -196,6 +196,21 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a 
href="/news/2015/06/24/announcing-apache-flink-0.9.0-release.html">Announcing 
Apache Flink 0.9.0</a></h2>
+
+      <p>24 Jun 2015
+      </p>
+
+      <p><p>The Apache Flink community is pleased to announce the availability 
of the 0.9.0 release. The release is the result of many months of hard work 
within the Flink community. It contains many new features and improvements 
which were previewed in the 0.9.0-milestone1 release and have been polished 
since then. This is the largest Flink release so far.</p>
+
+</p>
+
+      <p><a 
href="/news/2015/06/24/announcing-apache-flink-0.9.0-release.html">Continue 
reading &raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a 
href="/news/2015/05/14/Community-update-April.html">April 2015 in the Flink 
community</a></h2>
 
       <p>14 May 2015 by Kostas Tzoumas (<a 
href="https://twitter.com/";>@kostas_tzoumas</a>)
@@ -337,21 +352,6 @@ and offers a new API including definition of flexible 
windows.</p>
 
     <hr>
     
-    <article>
-      <h2 class="blog-title"><a 
href="/news/2015/01/06/december-in-flink.html">December 2014 in the Flink 
community</a></h2>
-
-      <p>06 Jan 2015
-      </p>
-
-      <p><p>This is the first blog post of a “newsletter” like series where we 
give a summary of the monthly activity in the Flink community. As the Flink 
project grows, this can serve as a “tl;dr” for people that are not following 
the Flink dev and user mailing lists, or those that are simply overwhelmed by 
the traffic.</p>
-
-</p>
-
-      <p><a href="/news/2015/01/06/december-in-flink.html">Continue reading 
&raquo;</a></p>
-    </article>
-
-    <hr>
-    
 
     <!-- Pagination links -->
     
@@ -384,6 +384,16 @@ and offers a new API including definition of flexible 
windows.</p>
 
     <ul id="markdown-toc">
       
+      <li><a href="/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">Flink 
on Zeppelin Notebooks for Interactive Data Analysis - Part 2</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2020/06/15/flink-on-zeppelin-part1.html">Flink on 
Zeppelin Notebooks for Interactive Data Analysis - Part 1</a></li>
 
       
diff --git a/content/blog/page12/index.html b/content/blog/page12/index.html
index e42e317..b89c090 100644
--- a/content/blog/page12/index.html
+++ b/content/blog/page12/index.html
@@ -196,6 +196,21 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a 
href="/news/2015/01/06/december-in-flink.html">December 2014 in the Flink 
community</a></h2>
+
+      <p>06 Jan 2015
+      </p>
+
+      <p><p>This is the first blog post of a “newsletter” like series where we 
give a summary of the monthly activity in the Flink community. As the Flink 
project grows, this can serve as a “tl;dr” for people that are not following 
the Flink dev and user mailing lists, or those that are simply overwhelmed by 
the traffic.</p>
+
+</p>
+
+      <p><a href="/news/2015/01/06/december-in-flink.html">Continue reading 
&raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a 
href="/news/2014/11/18/hadoop-compatibility.html">Hadoop Compatibility in 
Flink</a></h2>
 
       <p>18 Nov 2014 by Fabian Hüske (<a 
href="https://twitter.com/";>@fhueske</a>)
@@ -305,6 +320,16 @@ academic and open source project that Flink originates 
from.</p>
 
     <ul id="markdown-toc">
       
+      <li><a href="/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">Flink 
on Zeppelin Notebooks for Interactive Data Analysis - Part 2</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2020/06/15/flink-on-zeppelin-part1.html">Flink on 
Zeppelin Notebooks for Interactive Data Analysis - Part 1</a></li>
 
       
diff --git a/content/blog/page2/index.html b/content/blog/page2/index.html
index 21766ce..6ff283d 100644
--- a/content/blog/page2/index.html
+++ b/content/blog/page2/index.html
@@ -196,6 +196,19 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a 
href="/2020/04/09/pyflink-udf-support-flink.html">PyFlink: Introducing Python 
Support for UDFs in Flink's Table API</a></h2>
+
+      <p>09 Apr 2020
+       Jincheng Sun (<a 
href="https://twitter.com/sunjincheng121";>@sunjincheng121</a>) &amp; Markos 
Sfikas (<a href="https://twitter.com/MarkSfik";>@MarkSfik</a>)</p>
+
+      <p>Flink 1.10 extends its support for Python by adding Python UDFs in 
PyFlink. This post explains how UDFs work in PyFlink and gives some practical 
examples of how to use UDFs in PyFlink.</p>
+
+      <p><a href="/2020/04/09/pyflink-udf-support-flink.html">Continue reading 
&raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a 
href="/news/2020/04/07/release-statefun-2.0.0.html">Stateful Functions 2.0 - An 
Event-driven Database on Apache Flink</a></h2>
 
       <p>07 Apr 2020
@@ -321,19 +334,6 @@ This release marks a big milestone: Stateful Functions 2.0 
is not only an API up
 
     <hr>
     
-    <article>
-      <h2 class="blog-title"><a 
href="/news/2020/01/29/state-unlocked-interacting-with-state-in-apache-flink.html">State
 Unlocked: Interacting with State in Apache Flink</a></h2>
-
-      <p>29 Jan 2020
-       Seth Wiesman (<a 
href="https://twitter.com/sjwiesman";>@sjwiesman</a>)</p>
-
-      <p>This post discusses the efforts of the Flink community as they relate 
to state management in Apache Flink. We showcase some practical examples of how 
the different features and APIs can be utilized and cover some future ideas for 
new and improved ways of managing state in Apache Flink.</p>
-
-      <p><a 
href="/news/2020/01/29/state-unlocked-interacting-with-state-in-apache-flink.html">Continue
 reading &raquo;</a></p>
-    </article>
-
-    <hr>
-    
 
     <!-- Pagination links -->
     
@@ -366,6 +366,16 @@ This release marks a big milestone: Stateful Functions 2.0 
is not only an API up
 
     <ul id="markdown-toc">
       
+      <li><a href="/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">Flink 
on Zeppelin Notebooks for Interactive Data Analysis - Part 2</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2020/06/15/flink-on-zeppelin-part1.html">Flink on 
Zeppelin Notebooks for Interactive Data Analysis - Part 1</a></li>
 
       
diff --git a/content/blog/page3/index.html b/content/blog/page3/index.html
index a1808b9..7bc022e 100644
--- a/content/blog/page3/index.html
+++ b/content/blog/page3/index.html
@@ -196,6 +196,19 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a 
href="/news/2020/01/29/state-unlocked-interacting-with-state-in-apache-flink.html">State
 Unlocked: Interacting with State in Apache Flink</a></h2>
+
+      <p>29 Jan 2020
+       Seth Wiesman (<a 
href="https://twitter.com/sjwiesman";>@sjwiesman</a>)</p>
+
+      <p>This post discusses the efforts of the Flink community as they relate 
to state management in Apache Flink. We showcase some practical examples of how 
the different features and APIs can be utilized and cover some future ideas for 
new and improved ways of managing state in Apache Flink.</p>
+
+      <p><a 
href="/news/2020/01/29/state-unlocked-interacting-with-state-in-apache-flink.html">Continue
 reading &raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a 
href="/news/2020/01/15/demo-fraud-detection.html">Advanced Flink Application 
Patterns Vol.1: Case Study of a Fraud Detection System</a></h2>
 
       <p>15 Jan 2020
@@ -321,19 +334,6 @@
 
     <hr>
     
-    <article>
-      <h2 class="blog-title"><a 
href="/2019/07/23/flink-network-stack-2.html">Flink Network Stack Vol. 2: 
Monitoring, Metrics, and that Backpressure Thing</a></h2>
-
-      <p>23 Jul 2019
-       Nico Kruber  &amp; Piotr Nowojski </p>
-
-      <p>In a previous blog post, we presented how Flink’s network stack works 
from the high-level abstractions to the low-level details. This second  post 
discusses monitoring network-related metrics to identify backpressure or 
bottlenecks in throughput and latency.</p>
-
-      <p><a href="/2019/07/23/flink-network-stack-2.html">Continue reading 
&raquo;</a></p>
-    </article>
-
-    <hr>
-    
 
     <!-- Pagination links -->
     
@@ -366,6 +366,16 @@
 
     <ul id="markdown-toc">
       
+      <li><a href="/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">Flink 
on Zeppelin Notebooks for Interactive Data Analysis - Part 2</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2020/06/15/flink-on-zeppelin-part1.html">Flink on 
Zeppelin Notebooks for Interactive Data Analysis - Part 1</a></li>
 
       
diff --git a/content/blog/page4/index.html b/content/blog/page4/index.html
index 2036202..d13c64f 100644
--- a/content/blog/page4/index.html
+++ b/content/blog/page4/index.html
@@ -196,6 +196,19 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a 
href="/2019/07/23/flink-network-stack-2.html">Flink Network Stack Vol. 2: 
Monitoring, Metrics, and that Backpressure Thing</a></h2>
+
+      <p>23 Jul 2019
+       Nico Kruber  &amp; Piotr Nowojski </p>
+
+      <p>In a previous blog post, we presented how Flink’s network stack works 
from the high-level abstractions to the low-level details. This second  post 
discusses monitoring network-related metrics to identify backpressure or 
bottlenecks in throughput and latency.</p>
+
+      <p><a href="/2019/07/23/flink-network-stack-2.html">Continue reading 
&raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a 
href="/news/2019/07/02/release-1.8.1.html">Apache Flink 1.8.1 Released</a></h2>
 
       <p>02 Jul 2019
@@ -322,19 +335,6 @@ for more details.</p>
 
     <hr>
     
-    <article>
-      <h2 class="blog-title"><a href="/news/2019/03/06/ffsf-preview.html">What 
to expect from Flink Forward San Francisco 2019</a></h2>
-
-      <p>06 Mar 2019
-       Fabian Hueske (<a href="https://twitter.com/fhueske">@fhueske</a>)</p>
-
-      <p>The third annual Flink Forward conference in San Francisco is just a 
few weeks away. Let's see what Flink Forward SF 2019 has in store for the 
Apache Flink and stream processing communities. This post covers some of its 
highlights!</p>
-
-      <p><a href="/news/2019/03/06/ffsf-preview.html">Continue reading 
&raquo;</a></p>
-    </article>
-
-    <hr>
-    
 
     <!-- Pagination links -->
     
@@ -367,6 +367,16 @@ for more details.</p>
 
     <ul id="markdown-toc">
       
+      <li><a href="/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">Flink 
on Zeppelin Notebooks for Interactive Data Analysis - Part 2</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2020/06/15/flink-on-zeppelin-part1.html">Flink on 
Zeppelin Notebooks for Interactive Data Analysis - Part 1</a></li>
 
       
diff --git a/content/blog/page5/index.html b/content/blog/page5/index.html
index a0ad863..180eac4 100644
--- a/content/blog/page5/index.html
+++ b/content/blog/page5/index.html
@@ -196,6 +196,19 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a href="/news/2019/03/06/ffsf-preview.html">What 
to expect from Flink Forward San Francisco 2019</a></h2>
+
+      <p>06 Mar 2019
+       Fabian Hueske (<a href="https://twitter.com/fhueske">@fhueske</a>)</p>
+
+      <p>The third annual Flink Forward conference in San Francisco is just a 
few weeks away. Let's see what Flink Forward SF 2019 has in store for the 
Apache Flink and stream processing communities. This post covers some of its 
highlights!</p>
+
+      <p><a href="/news/2019/03/06/ffsf-preview.html">Continue reading 
&raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a 
href="/news/2019/02/25/monitoring-best-practices.html">Monitoring Apache Flink 
Applications 101</a></h2>
 
       <p>25 Feb 2019
@@ -328,21 +341,6 @@ Please check the <a 
href="https://issues.apache.org/jira/secure/ReleaseNote.jspa
 
     <hr>
     
-    <article>
-      <h2 class="blog-title"><a 
href="/news/2018/10/29/release-1.5.5.html">Apache Flink 1.5.5 Released</a></h2>
-
-      <p>29 Oct 2018
-      </p>
-
-      <p><p>The Apache Flink community released the fifth bugfix version of 
the Apache Flink 1.5 series.</p>
-
-</p>
-
-      <p><a href="/news/2018/10/29/release-1.5.5.html">Continue reading 
&raquo;</a></p>
-    </article>
-
-    <hr>
-    
 
     <!-- Pagination links -->
     
@@ -375,6 +373,16 @@ Please check the <a 
href="https://issues.apache.org/jira/secure/ReleaseNote.jspa
 
     <ul id="markdown-toc">
       
+      <li><a href="/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">Flink 
on Zeppelin Notebooks for Interactive Data Analysis - Part 2</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2020/06/15/flink-on-zeppelin-part1.html">Flink on 
Zeppelin Notebooks for Interactive Data Analysis - Part 1</a></li>
 
       
diff --git a/content/blog/page6/index.html b/content/blog/page6/index.html
index 316b494..a1867f6 100644
--- a/content/blog/page6/index.html
+++ b/content/blog/page6/index.html
@@ -196,6 +196,21 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a 
href="/news/2018/10/29/release-1.5.5.html">Apache Flink 1.5.5 Released</a></h2>
+
+      <p>29 Oct 2018
+      </p>
+
+      <p><p>The Apache Flink community released the fifth bugfix version of 
the Apache Flink 1.5 series.</p>
+
+</p>
+
+      <p><a href="/news/2018/10/29/release-1.5.5.html">Continue reading 
&raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a 
href="/news/2018/09/20/release-1.6.1.html">Apache Flink 1.6.1 Released</a></h2>
 
       <p>20 Sep 2018
@@ -330,19 +345,6 @@
 
     <hr>
     
-    <article>
-      <h2 class="blog-title"><a 
href="/features/2018/03/01/end-to-end-exactly-once-apache-flink.html">An 
Overview of End-to-End Exactly-Once Processing in Apache Flink (with Apache 
Kafka, too!)</a></h2>
-
-      <p>01 Mar 2018
-       Piotr Nowojski (<a 
href="https://twitter.com/PiotrNowojski";>@PiotrNowojski</a>) &amp; Mike Winters 
(<a href="https://twitter.com/wints";>@wints</a>)</p>
-
-      <p>Flink 1.4.0 introduced a new feature that makes it possible to build 
end-to-end exactly-once applications with Flink and data sources and sinks that 
support transactions.</p>
-
-      <p><a 
href="/features/2018/03/01/end-to-end-exactly-once-apache-flink.html">Continue 
reading &raquo;</a></p>
-    </article>
-
-    <hr>
-    
 
     <!-- Pagination links -->
     
@@ -375,6 +377,16 @@
 
     <ul id="markdown-toc">
       
+      <li><a href="/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">Flink 
on Zeppelin Notebooks for Interactive Data Analysis - Part 2</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2020/06/15/flink-on-zeppelin-part1.html">Flink on 
Zeppelin Notebooks for Interactive Data Analysis - Part 1</a></li>
 
       
diff --git a/content/blog/page7/index.html b/content/blog/page7/index.html
index 7896685..5dfe892 100644
--- a/content/blog/page7/index.html
+++ b/content/blog/page7/index.html
@@ -196,6 +196,19 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a 
href="/features/2018/03/01/end-to-end-exactly-once-apache-flink.html">An 
Overview of End-to-End Exactly-Once Processing in Apache Flink (with Apache 
Kafka, too!)</a></h2>
+
+      <p>01 Mar 2018
+       Piotr Nowojski (<a 
href="https://twitter.com/PiotrNowojski";>@PiotrNowojski</a>) &amp; Mike Winters 
(<a href="https://twitter.com/wints";>@wints</a>)</p>
+
+      <p>Flink 1.4.0 introduced a new feature that makes it possible to build 
end-to-end exactly-once applications with Flink and data sources and sinks that 
support transactions.</p>
+
+      <p><a 
href="/features/2018/03/01/end-to-end-exactly-once-apache-flink.html">Continue 
reading &raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a 
href="/news/2018/02/15/release-1.4.1.html">Apache Flink 1.4.1 Released</a></h2>
 
       <p>15 Feb 2018
@@ -329,21 +342,6 @@ what’s coming in Flink 1.4.0 as well as a preview of what 
the Flink community
 
     <hr>
     
-    <article>
-      <h2 class="blog-title"><a 
href="/news/2017/05/16/official-docker-image.html">Introducing Docker Images 
for Apache Flink</a></h2>
-
-      <p>16 May 2017 by Patrick Lucas (Data Artisans) and Ismaël Mejía 
(Talend) (<a href="https://twitter.com/";>@iemejia</a>)
-      </p>
-
-      <p><p>For some time, the Apache Flink community has provided scripts to 
build a Docker image to run Flink. Now, starting with version 1.2.1, Flink will 
have a <a href="https://hub.docker.com/r/_/flink/";>Docker image</a> on the 
Docker Hub. This image is maintained by the Flink community and curated by the 
<a href="https://github.com/docker-library/official-images";>Docker</a> team to 
ensure it meets the quality standards for container images of the Docker 
community.</p>
-
-</p>
-
-      <p><a href="/news/2017/05/16/official-docker-image.html">Continue 
reading &raquo;</a></p>
-    </article>
-
-    <hr>
-    
 
     <!-- Pagination links -->
     
@@ -376,6 +374,16 @@ what’s coming in Flink 1.4.0 as well as a preview of what 
the Flink community
 
     <ul id="markdown-toc">
       
+      <li><a href="/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">Flink 
on Zeppelin Notebooks for Interactive Data Analysis - Part 2</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2020/06/15/flink-on-zeppelin-part1.html">Flink on 
Zeppelin Notebooks for Interactive Data Analysis - Part 1</a></li>
 
       
diff --git a/content/blog/page8/index.html b/content/blog/page8/index.html
index 226bafe..98d2217 100644
--- a/content/blog/page8/index.html
+++ b/content/blog/page8/index.html
@@ -196,6 +196,21 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a 
href="/news/2017/05/16/official-docker-image.html">Introducing Docker Images 
for Apache Flink</a></h2>
+
+      <p>16 May 2017 by Patrick Lucas (Data Artisans) and Ismaël Mejía 
(Talend) (<a href="https://twitter.com/";>@iemejia</a>)
+      </p>
+
+      <p><p>For some time, the Apache Flink community has provided scripts to 
build a Docker image to run Flink. Now, starting with version 1.2.1, Flink will 
have a <a href="https://hub.docker.com/r/_/flink/";>Docker image</a> on the 
Docker Hub. This image is maintained by the Flink community and curated by the 
<a href="https://github.com/docker-library/official-images";>Docker</a> team to 
ensure it meets the quality standards for container images of the Docker 
community.</p>
+
+</p>
+
+      <p><a href="/news/2017/05/16/official-docker-image.html">Continue 
reading &raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a 
href="/news/2017/04/26/release-1.2.1.html">Apache Flink 1.2.1 Released</a></h2>
 
       <p>26 Apr 2017
@@ -323,21 +338,6 @@
 
     <hr>
     
-    <article>
-      <h2 class="blog-title"><a 
href="/news/2016/08/24/ff16-keynotes-panels.html">Flink Forward 2016: 
Announcing Schedule, Keynotes, and Panel Discussion</a></h2>
-
-      <p>24 Aug 2016
-      </p>
-
-      <p><p>An update for the Flink community: the <a 
href="http://flink-forward.org/kb_day/day-1/";>Flink Forward 2016 schedule</a> 
is now available online. This year's event will include 2 days of talks from 
stream processing experts at Google, MapR, Alibaba, Netflix, Cloudera, and 
more. Following the talks is a full day of hands-on Flink training.</p>
-
-</p>
-
-      <p><a href="/news/2016/08/24/ff16-keynotes-panels.html">Continue reading 
&raquo;</a></p>
-    </article>
-
-    <hr>
-    
 
     <!-- Pagination links -->
     
@@ -370,6 +370,16 @@
 
     <ul id="markdown-toc">
       
+      <li><a href="/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">Flink 
on Zeppelin Notebooks for Interactive Data Analysis - Part 2</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2020/06/15/flink-on-zeppelin-part1.html">Flink on 
Zeppelin Notebooks for Interactive Data Analysis - Part 1</a></li>
 
       
diff --git a/content/blog/page9/index.html b/content/blog/page9/index.html
index 6f99fcc..f4d3ef0 100644
--- a/content/blog/page9/index.html
+++ b/content/blog/page9/index.html
@@ -196,6 +196,21 @@
     <!-- Blog posts -->
     
     <article>
+      <h2 class="blog-title"><a 
href="/news/2016/08/24/ff16-keynotes-panels.html">Flink Forward 2016: 
Announcing Schedule, Keynotes, and Panel Discussion</a></h2>
+
+      <p>24 Aug 2016
+      </p>
+
+      <p><p>An update for the Flink community: the <a 
href="http://flink-forward.org/kb_day/day-1/";>Flink Forward 2016 schedule</a> 
is now available online. This year's event will include 2 days of talks from 
stream processing experts at Google, MapR, Alibaba, Netflix, Cloudera, and 
more. Following the talks is a full day of hands-on Flink training.</p>
+
+</p>
+
+      <p><a href="/news/2016/08/24/ff16-keynotes-panels.html">Continue reading 
&raquo;</a></p>
+    </article>
+
+    <hr>
+    
+    <article>
       <h2 class="blog-title"><a 
href="/news/2016/08/11/release-1.1.1.html">Flink 1.1.1 Released</a></h2>
 
       <p>11 Aug 2016
@@ -327,21 +342,6 @@
 
     <hr>
     
-    <article>
-      <h2 class="blog-title"><a 
href="/news/2016/02/11/release-0.10.2.html">Flink 0.10.2 Released</a></h2>
-
-      <p>11 Feb 2016
-      </p>
-
-      <p><p>Today, the Flink community released Flink version 
<strong>0.10.2</strong>, the second bugfix release of the 0.10 series.</p>
-
-</p>
-
-      <p><a href="/news/2016/02/11/release-0.10.2.html">Continue reading 
&raquo;</a></p>
-    </article>
-
-    <hr>
-    
 
     <!-- Pagination links -->
     
@@ -374,6 +374,16 @@
 
     <ul id="markdown-toc">
       
+      <li><a href="/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">Flink 
on Zeppelin Notebooks for Interactive Data Analysis - Part 2</a></li>
+
+      
+        
+      
+    
+      
+      
+
+      
       <li><a href="/news/2020/06/15/flink-on-zeppelin-part1.html">Flink on 
Zeppelin Notebooks for Interactive Data Analysis - Part 1</a></li>
 
       
diff --git a/content/ecosystem/2020/06/23/flink-on-zeppelin-part2.html 
b/content/ecosystem/2020/06/23/flink-on-zeppelin-part2.html
new file mode 100644
index 0000000..2957465
--- /dev/null
+++ b/content/ecosystem/2020/06/23/flink-on-zeppelin-part2.html
@@ -0,0 +1,365 @@
+<!DOCTYPE html>
+<html lang="en">
+  <head>
+    <meta charset="utf-8">
+    <meta http-equiv="X-UA-Compatible" content="IE=edge">
+    <meta name="viewport" content="width=device-width, initial-scale=1">
+    <!-- The above 3 meta tags *must* come first in the head; any other head 
content must come *after* these tags -->
+    <title>Apache Flink: Flink on Zeppelin Notebooks for Interactive Data 
Analysis - Part 2</title>
+    <link rel="shortcut icon" href="/favicon.ico" type="image/x-icon">
+    <link rel="icon" href="/favicon.ico" type="image/x-icon">
+
+    <!-- Bootstrap -->
+    <link rel="stylesheet" 
href="https://maxcdn.bootstrapcdn.com/bootstrap/3.4.1/css/bootstrap.min.css";>
+    <link rel="stylesheet" href="/css/flink.css">
+    <link rel="stylesheet" href="/css/syntax.css">
+
+    <!-- Blog RSS feed -->
+    <link href="/blog/feed.xml" rel="alternate" type="application/rss+xml" 
title="Apache Flink Blog: RSS feed" />
+
+    <!-- jQuery (necessary for Bootstrap's JavaScript plugins) -->
+    <!-- We need to load Jquery in the header for custom google analytics 
event tracking-->
+    <script src="/js/jquery.min.js"></script>
+
+    <!-- HTML5 shim and Respond.js for IE8 support of HTML5 elements and media 
queries -->
+    <!-- WARNING: Respond.js doesn't work if you view the page via file:// -->
+    <!--[if lt IE 9]>
+      <script 
src="https://oss.maxcdn.com/html5shiv/3.7.2/html5shiv.min.js";></script>
+      <script 
src="https://oss.maxcdn.com/respond/1.4.2/respond.min.js";></script>
+    <![endif]-->
+  </head>
+  <body>  
+    
+
+    <!-- Main content. -->
+    <div class="container">
+    <div class="row">
+
+      
+     <div id="sidebar" class="col-sm-3">
+        
+
+<!-- Top navbar. -->
+    <nav class="navbar navbar-default">
+        <!-- The logo. -->
+        <div class="navbar-header">
+          <button type="button" class="navbar-toggle collapsed" 
data-toggle="collapse" data-target="#bs-example-navbar-collapse-1">
+            <span class="icon-bar"></span>
+            <span class="icon-bar"></span>
+            <span class="icon-bar"></span>
+          </button>
+          <div class="navbar-logo">
+            <a href="/">
+              <img alt="Apache Flink" src="/img/flink-header-logo.svg" 
width="147px" height="73px">
+            </a>
+          </div>
+        </div><!-- /.navbar-header -->
+
+        <!-- The navigation links. -->
+        <div class="collapse navbar-collapse" 
id="bs-example-navbar-collapse-1">
+          <ul class="nav navbar-nav navbar-main">
+
+            <!-- First menu section explains visitors what Flink is -->
+
+            <!-- What is Stream Processing? -->
+            <!--
+            <li><a href="/streamprocessing1.html">What is Stream 
Processing?</a></li>
+            -->
+
+            <!-- What is Flink? -->
+            <li><a href="/flink-architecture.html">What is Apache 
Flink?</a></li>
+
+            
+            <ul class="nav navbar-nav navbar-subnav">
+              <li >
+                  <a href="/flink-architecture.html">Architecture</a>
+              </li>
+              <li >
+                  <a href="/flink-applications.html">Applications</a>
+              </li>
+              <li >
+                  <a href="/flink-operations.html">Operations</a>
+              </li>
+            </ul>
+            
+
+            <!-- What is Stateful Functions? -->
+
+            <li><a href="/stateful-functions.html">What is Stateful 
Functions?</a></li>
+
+            <!-- Use cases -->
+            <li><a href="/usecases.html">Use Cases</a></li>
+
+            <!-- Powered by -->
+            <li><a href="/poweredby.html">Powered By</a></li>
+
+
+            &nbsp;
+            <!-- Second menu section aims to support Flink users -->
+
+            <!-- Downloads -->
+            <li><a href="/downloads.html">Downloads</a></li>
+
+            <!-- Getting Started -->
+            <li class="dropdown">
+              <a class="dropdown-toggle" data-toggle="dropdown" 
href="#">Getting Started<span class="caret"></span></a>
+              <ul class="dropdown-menu">
+                <li><a 
href="https://ci.apache.org/projects/flink/flink-docs-release-1.10/getting-started/index.html";
 target="_blank">With Flink <small><span class="glyphicon 
glyphicon-new-window"></span></small></a></li>
+                <li><a 
href="https://ci.apache.org/projects/flink/flink-statefun-docs-release-2.1/getting-started/project-setup.html";
 target="_blank">With Flink Stateful Functions <small><span class="glyphicon 
glyphicon-new-window"></span></small></a></li>
+                <li><a href="/training.html">Training Course</a></li>
+              </ul>
+            </li>
+
+            <!-- Documentation -->
+            <li class="dropdown">
+              <a class="dropdown-toggle" data-toggle="dropdown" 
href="#">Documentation<span class="caret"></span></a>
+              <ul class="dropdown-menu">
+                <li><a 
href="https://ci.apache.org/projects/flink/flink-docs-release-1.10"; 
target="_blank">Flink 1.10 (Latest stable release) <small><span 
class="glyphicon glyphicon-new-window"></span></small></a></li>
+                <li><a 
href="https://ci.apache.org/projects/flink/flink-docs-master"; 
target="_blank">Flink Master (Latest Snapshot) <small><span class="glyphicon 
glyphicon-new-window"></span></small></a></li>
+                <li><a 
href="https://ci.apache.org/projects/flink/flink-statefun-docs-release-2.1"; 
target="_blank">Flink Stateful Functions 2.1 (Latest stable release) 
<small><span class="glyphicon glyphicon-new-window"></span></small></a></li>
+                <li><a 
href="https://ci.apache.org/projects/flink/flink-statefun-docs-master"; 
target="_blank">Flink Stateful Functions Master (Latest Snapshot) <small><span 
class="glyphicon glyphicon-new-window"></span></small></a></li>
+              </ul>
+            </li>
+
+            <!-- getting help -->
+            <li><a href="/gettinghelp.html">Getting Help</a></li>
+
+            <!-- Blog -->
+            <li><a href="/blog/"><b>Flink Blog</b></a></li>
+
+
+            <!-- Flink-packages -->
+            <li>
+              <a href="https://flink-packages.org"; 
target="_blank">flink-packages.org <small><span class="glyphicon 
glyphicon-new-window"></span></small></a>
+            </li>
+            &nbsp;
+
+            <!-- Third menu section aim to support community and contributors 
-->
+
+            <!-- Community -->
+            <li><a href="/community.html">Community &amp; Project Info</a></li>
+
+            <!-- Roadmap -->
+            <li><a href="/roadmap.html">Roadmap</a></li>
+
+            <!-- Contribute -->
+            <li><a href="/contributing/how-to-contribute.html">How to 
Contribute</a></li>
+            
+
+            <!-- GitHub -->
+            <li>
+              <a href="https://github.com/apache/flink"; target="_blank">Flink 
on GitHub <small><span class="glyphicon 
glyphicon-new-window"></span></small></a>
+            </li>
+
+            &nbsp;
+
+            <!-- Language Switcher -->
+            <li>
+              
+                
+                  <a 
href="/zh/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">中文版</a>
+                
+              
+            </li>
+
+          </ul>
+
+          <ul class="nav navbar-nav navbar-bottom">
+          <hr />
+
+            <!-- Twitter -->
+            <li><a href="https://twitter.com/apacheflink"; 
target="_blank">@ApacheFlink <small><span class="glyphicon 
glyphicon-new-window"></span></small></a></li>
+
+            <!-- Visualizer -->
+            <li class=" hidden-md hidden-sm"><a href="/visualizer/" 
target="_blank">Plan Visualizer <small><span class="glyphicon 
glyphicon-new-window"></span></small></a></li>
+
+          <hr />
+
+            <li><a href="https://apache.org"; target="_blank">Apache Software 
Foundation <small><span class="glyphicon 
glyphicon-new-window"></span></small></a></li>
+
+            <li>
+              <style>
+                .smalllinks:link {
+                  display: inline-block !important; background: none; 
padding-top: 0px; padding-bottom: 0px; padding-right: 0px; min-width: 75px;
+                }
+              </style>
+
+              <a class="smalllinks" href="https://www.apache.org/licenses/"; 
target="_blank">License</a> <small><span class="glyphicon 
glyphicon-new-window"></span></small>
+
+              <a class="smalllinks" href="https://www.apache.org/security/"; 
target="_blank">Security</a> <small><span class="glyphicon 
glyphicon-new-window"></span></small>
+
+              <a class="smalllinks" 
href="https://www.apache.org/foundation/sponsorship.html"; 
target="_blank">Donate</a> <small><span class="glyphicon 
glyphicon-new-window"></span></small>
+
+              <a class="smalllinks" 
href="https://www.apache.org/foundation/thanks.html"; target="_blank">Thanks</a> 
<small><span class="glyphicon glyphicon-new-window"></span></small>
+            </li>
+
+          </ul>
+        </div><!-- /.navbar-collapse -->
+    </nav>
+
+      </div>
+      <div class="col-sm-9">
+      <div class="row-fluid">
+  <div class="col-sm-12">
+    <div class="row">
+      <h1>Flink on Zeppelin Notebooks for Interactive Data Analysis - Part 
2</h1>
+      <p><i></i></p>
+
+      <article>
+        <p>23 Jun 2020 Jeff Zhang (<a 
href="https://twitter.com/zjffdu";>@zjffdu</a>)</p>
+
+<p>In a previous post, we introduced the basics of Flink on Zeppelin and how 
to do Streaming ETL. In this second part of the “Flink on Zeppelin” series of 
posts, I will share how to 
+perform streaming data visualization via Flink on Zeppelin and how to use 
Apache Flink UDFs in Zeppelin.</p>
+
+<h1 id="streaming-data-visualization">Streaming Data Visualization</h1>
+
+<p>With <a href="https://zeppelin.apache.org/";>Zeppelin</a>, you can build a real-time streaming dashboard without writing a single line of JavaScript/HTML/CSS code.</p>
+
+<p>Overall, Zeppelin supports 3 kinds of streaming data analytics:</p>
+
+<ul>
+  <li>Single Mode</li>
+  <li>Update Mode</li>
+  <li>Append Mode</li>
+</ul>
+
+<h3 id="single-mode">Single Mode</h3>
+<p>Single mode is intended for cases where the result of a SQL statement is always a single row, as in the following example.
+The output is rendered as HTML, and you can specify a paragraph-local property, <code>template</code>, that defines the final output layout.
+Inside the template, <code>{i}</code> serves as a placeholder for the i-th column of the result.</p>
+
+<center>
+<img src="/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_single_mode.gif" 
width="80%" alt="Single Mode" />
+</center>
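+
+<p>For reference, here is a minimal sketch of a single-mode paragraph. The table name <code>log_data</code> and the exact local properties shown are illustrative assumptions and may vary with your Zeppelin version:</p>
+
+<div class="highlight"><pre><code class="language-sql">%flink.ssql(type=single, refreshInterval=3000, template=Total count: {0})
+-- {0} in the template is replaced by the first (and only) column of the single-row result
+select count(*) from log_data</code></pre></div>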
+
+<h3 id="update-mode">Update Mode</h3>
+<p>Update mode is suitable for cases where the output consists of more than one row
+and is continuously updated. Here's an example that uses <code>GROUP BY</code>.</p>
+
+<center>
+<img src="/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_update_mode.gif" 
width="80%" alt="Update Mode" />
+</center>
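+
+<p>As a rough sketch (the table <code>log_data</code> and its <code>status</code> column are hypothetical), an update-mode paragraph could look like this:</p>
+
+<div class="highlight"><pre><code class="language-sql">%flink.ssql(type=update)
+-- the result table is re-emitted whenever the aggregate changes
+select status, count(1) as pv from log_data group by status</code></pre></div>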
+
+<h3 id="append-mode">Append Mode</h3>
+<p>Append mode is suitable for cases where output rows are only ever appended.
+For instance, the example below uses a tumble window.</p>
+
+<center>
+<img src="/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_append_mode.gif" 
width="80%" alt="Append Mode" />
+</center>
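+
+<p>A minimal sketch of an append-mode paragraph with a tumble window, assuming a hypothetical table <code>log_data</code> with an event-time attribute <code>event_ts</code>, might look as follows:</p>
+
+<div class="highlight"><pre><code class="language-sql">%flink.ssql(type=append)
+-- each completed 5-second window appends one new row per status value
+select TUMBLE_START(event_ts, INTERVAL '5' SECOND) as window_start,
+       status,
+       count(1) as pv
+from log_data
+group by TUMBLE(event_ts, INTERVAL '5' SECOND), status</code></pre></div>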
+
+<h1 id="udf">UDF</h1>
+
+<p>SQL is a very powerful language, especially for expressing data flow. But most of the time, you also need to handle complicated business logic that cannot be expressed in SQL.
+In these cases, UDFs (user-defined functions) come in particularly handy. In Zeppelin, you can write Scala or Python UDFs, and you can also import existing Scala, Python, and Java UDFs.
+Here are two examples of Scala and Python UDFs:</p>
+
+<ul>
+  <li>Scala UDF</li>
+</ul>
+
+<div class="highlight"><pre><code class="language-scala"><span 
class="o">%</span><span class="n">flink</span>
+
+<span class="k">class</span> <span class="nc">ScalaUpper</span> <span 
class="k">extends</span> <span class="nc">ScalarFunction</span> <span 
class="o">{</span>
+<span class="k">def</span> <span class="n">eval</span><span 
class="o">(</span><span class="n">str</span><span class="k">:</span> <span 
class="kt">String</span><span class="o">)</span> <span class="k">=</span> <span 
class="n">str</span><span class="o">.</span><span class="n">toUpperCase</span>
+<span class="o">}</span>
+<span class="n">btenv</span><span class="o">.</span><span 
class="n">registerFunction</span><span class="o">(</span><span 
class="s">&quot;scala_upper&quot;</span><span class="o">,</span> <span 
class="k">new</span> <span class="nc">ScalaUpper</span><span 
class="o">())</span></code></pre></div>
+
+<ul>
+  <li>Python UDF</li>
+</ul>
+
+<div class="highlight"><pre><code class="language-python"><span 
class="o">%</span><span class="n">flink</span><span class="o">.</span><span 
class="n">pyflink</span>
+
+<span class="k">class</span> <span class="nc">PythonUpper</span><span 
class="p">(</span><span class="n">ScalarFunction</span><span class="p">):</span>
+<span class="k">def</span> <span class="nf">eval</span><span 
class="p">(</span><span class="bp">self</span><span class="p">,</span> <span 
class="n">s</span><span class="p">):</span>
+ <span class="k">return</span> <span class="n">s</span><span 
class="o">.</span><span class="n">upper</span><span class="p">()</span>
+
+<span class="n">bt_env</span><span class="o">.</span><span 
class="n">register_function</span><span class="p">(</span><span 
class="s">&quot;python_upper&quot;</span><span class="p">,</span> <span 
class="n">udf</span><span class="p">(</span><span 
class="n">PythonUpper</span><span class="p">(),</span> <span 
class="n">DataTypes</span><span class="o">.</span><span 
class="n">STRING</span><span class="p">(),</span> <span 
class="n">DataTypes</span><span class="o">.</span><span class="n">STRING</ [...]
+
+<p>After you define the UDFs, you can use them directly in SQL:</p>
+
+<ul>
+  <li>Use Scala UDF in SQL</li>
+</ul>
+
+<center>
+<img src="/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_scala_udf.png" 
width="100%" alt="Scala UDF" />
+</center>
+
+<ul>
+  <li>Use Python UDF in SQL</li>
+</ul>
+
+<center>
+<img src="/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_python_udf.png" 
width="100%" alt="Python UDF" />
+</center>
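+
+<p>For reference, here is a usage sketch for the two functions registered above (again assuming a hypothetical <code>log_data</code> table with a <code>status</code> column):</p>
+
+<div class="highlight"><pre><code class="language-sql">%flink.bsql
+-- scala_upper was registered on the batch table environment above; python_upper works the same way
+select scala_upper(status) as upper_status, count(1) as pv
+from log_data
+group by scala_upper(status)</code></pre></div>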
+
+<h1 id="summary">Summary</h1>
+
+<p>In this post, we explained how to perform streaming data visualization via Flink on Zeppelin and how to use UDFs.
+Beyond that, you can do much more with Flink in Zeppelin, such as batch processing and Hive integration.
+You can check the articles below for more details, and here's a list of <a href="https://www.youtube.com/watch?v=YxPo0Fosjjg&amp;list=PL4oy12nnS7FFtg3KV1iS5vDb0pTz12VcX";>Flink on Zeppelin tutorial videos</a> for your reference.</p>
+
+<h1 id="references">References</h1>
+
+<ul>
+  <li><a href="http://zeppelin.apache.org";>Apache Zeppelin official 
website</a></li>
+  <li>Flink on Zeppelin tutorials - <a 
href="https://medium.com/@zjffdu/flink-on-zeppelin-part-1-get-started-2591aaa6aa47";>Part
 1</a></li>
+  <li>Flink on Zeppelin tutorials - <a 
href="https://medium.com/@zjffdu/flink-on-zeppelin-part-2-batch-711731df5ad9";>Part
 2</a></li>
+  <li>Flink on Zeppelin tutorials - <a 
href="https://medium.com/@zjffdu/flink-on-zeppelin-part-3-streaming-5fca1e16754";>Part
 3</a></li>
+  <li>Flink on Zeppelin tutorials - <a 
href="https://medium.com/@zjffdu/flink-on-zeppelin-part-4-advanced-usage-998b74908cd9";>Part
 4</a></li>
+  <li><a 
href="https://www.youtube.com/watch?v=YxPo0Fosjjg&amp;list=PL4oy12nnS7FFtg3KV1iS5vDb0pTz12VcX";>Flink
 on Zeppelin tutorial videos</a></li>
+</ul>
+
+      </article>
+    </div>
+
+    <div class="row">
+      <div id="disqus_thread"></div>
+      <script type="text/javascript">
+        /* * * CONFIGURATION VARIABLES: EDIT BEFORE PASTING INTO YOUR WEBPAGE 
* * */
+        var disqus_shortname = 'stratosphere-eu'; // required: replace example 
with your forum shortname
+
+        /* * * DON'T EDIT BELOW THIS LINE * * */
+        (function() {
+            var dsq = document.createElement('script'); dsq.type = 
'text/javascript'; dsq.async = true;
+            dsq.src = '//' + disqus_shortname + '.disqus.com/embed.js';
+             (document.getElementsByTagName('head')[0] || 
document.getElementsByTagName('body')[0]).appendChild(dsq);
+        })();
+      </script>
+    </div>
+  </div>
+</div>
+      </div>
+    </div>
+
+    <hr />
+
+    <div class="row">
+      <div class="footer text-center col-sm-12">
+        <p>Copyright © 2014-2019 <a href="http://apache.org";>The Apache 
Software Foundation</a>. All Rights Reserved.</p>
+        <p>Apache Flink, Flink®, Apache®, the squirrel logo, and the Apache 
feather logo are either registered trademarks or trademarks of The Apache 
Software Foundation.</p>
+        <p><a href="/privacy-policy.html">Privacy Policy</a> &middot; <a 
href="/blog/feed.xml">RSS feed</a></p>
+      </div>
+    </div>
+    </div><!-- /.container -->
+
+    <!-- Include all compiled plugins (below), or include individual files as 
needed -->
+    <script 
src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.4/js/bootstrap.min.js";></script>
+    <script 
src="https://cdnjs.cloudflare.com/ajax/libs/jquery.matchHeight/0.7.0/jquery.matchHeight-min.js";></script>
+    <script src="/js/codetabs.js"></script>
+    <script src="/js/stickysidebar.js"></script>
+
+    <!-- Google Analytics -->
+    <script>
+      
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
+      (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new 
Date();a=s.createElement(o),
+      
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
+      
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');
+
+      ga('create', 'UA-52545728-1', 'auto');
+      ga('send', 'pageview');
+    </script>
+  </body>
+</html>
diff --git 
a/content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_append_mode.gif 
b/content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_append_mode.gif
new file mode 100644
index 0000000..3c827f4
Binary files /dev/null and 
b/content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_append_mode.gif 
differ
diff --git 
a/content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_python_udf.png 
b/content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_python_udf.png
new file mode 100644
index 0000000..e4caaf5
Binary files /dev/null and 
b/content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_python_udf.png 
differ
diff --git 
a/content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_scala_udf.png 
b/content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_scala_udf.png
new file mode 100644
index 0000000..4448ad1
Binary files /dev/null and 
b/content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_scala_udf.png differ
diff --git 
a/content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_single_mode.gif 
b/content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_single_mode.gif
new file mode 100644
index 0000000..91b49ed
Binary files /dev/null and 
b/content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_single_mode.gif 
differ
diff --git 
a/content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_update_mode.gif 
b/content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_update_mode.gif
new file mode 100644
index 0000000..fe7e2e9
Binary files /dev/null and 
b/content/img/blog/2020-06-23-flink-on-zeppelin-part2/flink_update_mode.gif 
differ
diff --git a/content/index.html b/content/index.html
index a686f3c..5c816f9 100644
--- a/content/index.html
+++ b/content/index.html
@@ -568,6 +568,12 @@
 
   <dl>
       
+        <dt> <a 
href="/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">Flink on Zeppelin 
Notebooks for Interactive Data Analysis - Part 2</a></dt>
+        <dd><p>In a previous post, we introduced the basics of Flink on 
Zeppelin and how to do Streaming ETL. In this second part of the “Flink on 
Zeppelin” series of posts, I will share how to 
+perform streaming data visualization via Flink on Zeppelin and how to use 
Apache Flink UDFs in Zeppelin.</p>
+
+</dd>
+      
         <dt> <a href="/news/2020/06/15/flink-on-zeppelin-part1.html">Flink on 
Zeppelin Notebooks for Interactive Data Analysis - Part 1</a></dt>
         <dd><p>The latest release of <a 
href="https://zeppelin.apache.org/";>Apache Zeppelin</a> comes with a redesigned 
interpreter for Apache Flink (version Flink 1.10+ is only supported moving 
forward) 
 that allows developers to use Flink directly on Zeppelin notebooks for 
interactive data analysis. I wrote 2 posts about how to use Flink in Zeppelin. 
This is part-1 where I explain how the Flink interpreter in Zeppelin works, 
@@ -587,9 +593,6 @@ and provide a tutorial for running Streaming ETL with Flink 
on Zeppelin.</p>
         <dd><p>The Apache Flink community released the first bugfix version of 
the Apache Flink 1.10 series.</p>
 
 </dd>
-      
-        <dt> <a href="/news/2020/05/07/community-update.html">Flink Community 
Update - May'20</a></dt>
-        <dd>Can you smell it? It’s release month! This time around, we’re 
warming up for Flink 1.11 and peeping back to the past month in the Flink 
community — with the release of Stateful Functions 2.0, a new self-paced Flink 
training and some efforts to improve the Flink documentation experience.</dd>
     
   </dl>
 
diff --git a/content/zh/index.html b/content/zh/index.html
index 54ea3b9..f4a3546 100644
--- a/content/zh/index.html
+++ b/content/zh/index.html
@@ -565,6 +565,12 @@
 
   <dl>
       
+        <dt> <a 
href="/ecosystem/2020/06/23/flink-on-zeppelin-part2.html">Flink on Zeppelin 
Notebooks for Interactive Data Analysis - Part 2</a></dt>
+        <dd><p>In a previous post, we introduced the basics of Flink on 
Zeppelin and how to do Streaming ETL. In this second part of the “Flink on 
Zeppelin” series of posts, I will share how to 
+perform streaming data visualization via Flink on Zeppelin and how to use 
Apache Flink UDFs in Zeppelin.</p>
+
+</dd>
+      
         <dt> <a href="/news/2020/06/15/flink-on-zeppelin-part1.html">Flink on 
Zeppelin Notebooks for Interactive Data Analysis - Part 1</a></dt>
         <dd><p>The latest release of <a 
href="https://zeppelin.apache.org/";>Apache Zeppelin</a> comes with a redesigned 
interpreter for Apache Flink (version Flink 1.10+ is only supported moving 
forward) 
 that allows developers to use Flink directly on Zeppelin notebooks for 
interactive data analysis. I wrote 2 posts about how to use Flink in Zeppelin. 
This is part-1 where I explain how the Flink interpreter in Zeppelin works, 
@@ -584,9 +590,6 @@ and provide a tutorial for running Streaming ETL with Flink 
on Zeppelin.</p>
         <dd><p>The Apache Flink community released the first bugfix version of 
the Apache Flink 1.10 series.</p>
 
 </dd>
-      
-        <dt> <a href="/news/2020/05/07/community-update.html">Flink Community 
Update - May'20</a></dt>
-        <dd>Can you smell it? It’s release month! This time around, we’re 
warming up for Flink 1.11 and peeping back to the past month in the Flink 
community — with the release of Stateful Functions 2.0, a new self-paced Flink 
training and some efforts to improve the Flink documentation experience.</dd>
     
   </dl>
 
