shuiqiangchen commented on a change in pull request #13230:
URL: https://github.com/apache/flink/pull/13230#discussion_r482670348



##########
File path: docs/dev/python/user-guide/datastream/operators.md
##########
@@ -0,0 +1,87 @@
+---
+title: "Operators"
+nav-parent_id: python_datastream_api
+nav-pos: 20
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+Operators transform one or more DataStreams into a new DataStream. Programs can combine multiple transformations into
+sophisticated dataflow topologies.
+
+This section gives a description of the basic transformations the Python DataStream API provides, the effective
+physical partitioning after applying them, as well as insights into Flink's operator chaining.
+
+* This will be replaced by the TOC
+{:toc}
+
+# DataStream Transformations
+
+DataStream programs in Flink are regular programs that implement transformations on data streams (e.g., mapping,
+filtering, reducing). Please see [operators]({% link dev/stream/operators/index.md %}?code_tab=python) for an
+overview of the available stream transformations in the Python DataStream API.
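
As a minimal illustrative sketch (not taken from the page above; it assumes the standard PyFlink imports, a locally created execution environment, and an arbitrary job name), chaining a couple of these transformations could look like this:

{% highlight python %}
from pyflink.common.typeinfo import Types
from pyflink.datastream import StreamExecutionEnvironment

env = StreamExecutionEnvironment.get_execution_environment()

# Build a stream, double each element, then keep only values greater than 4.
data_stream = env.from_collection([1, 2, 3, 4, 5], type_info=Types.INT())
result_stream = data_stream \
    .map(lambda x: x * 2, output_type=Types.INT()) \
    .filter(lambda x: x > 4)

result_stream.print()
env.execute("basic_transformations_sketch")
{% endhighlight %}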
+
+# Functions
+Most operators require a user-defined function. The following describes the different ways in which they can be specified.
+
+## Implementing Function Interfaces
+Function interfaces for different operations are provided in the Python DataStream API. Users can implement a Function
+and pass it to the corresponding operation. Take MapFunction for instance:
+<p>
+{% highlight python %}
+# Implement a MapFunction that adds one to the input value.
+class MyMapFunction(MapFunction):
+
+    def map(self, value):
+        return value + 1
+
+data_stream = env.from_collection([1, 2, 3, 4, 5], type_info=Types.INT())
+mapped_stream = data_stream.map(MyMapFunction(), output_type=Types.INT())
+{% endhighlight %}
+</p>
+<span class="label label-info">Note</span> In the Python DataStream API, users are able to define the output type
+information of the operation. If not defined, the output type will be `Types.PICKLED_BYTE_ARRAY`, so that the data
+will be in the form of a byte array generated by the pickle serializer. For more details about `Pickle Serialization`,
+please refer to [DataTypes]({% link dev/python/user-guide/datastream/data_types.md %}#pickle-serialization).
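
To make that note concrete, here is a minimal sketch (reusing the hypothetical `data_stream` and `MyMapFunction` from the example above) of the difference between omitting and declaring `output_type`:

{% highlight python %}
# No output_type given: the output type defaults to Types.PICKLED_BYTE_ARRAY,
# i.e. the records are serialized with the pickle serializer.
pickled_stream = data_stream.map(MyMapFunction())

# Declaring the output type explicitly avoids the pickle fallback.
typed_stream = data_stream.map(MyMapFunction(), output_type=Types.INT())
{% endhighlight %}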
+
+## Lambda Functions
+As shown in previous examples, all operations can also accept a lambda function for describing the operation:
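
A minimal sketch of such a lambda-based operation (assuming the same `env` and `Types` as in the earlier example) could be:

{% highlight python %}
# The same plus-one mapping, expressed as a lambda instead of a MapFunction.
data_stream = env.from_collection([1, 2, 3, 4, 5], type_info=Types.INT())
mapped_stream = data_stream.map(lambda x: x + 1, output_type=Types.INT())
{% endhighlight %}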

Review comment:
       The 'previous examples' refers to the example on the index page; here is a more concrete illustration.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

