[
https://issues.apache.org/jira/browse/APEXMALHAR-2006?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15289435#comment-15289435
]
ASF GitHub Bot commented on APEXMALHAR-2006:
--------------------------------------------
Github user siyuanh commented on a diff in the pull request:
https://github.com/apache/incubator-apex-malhar/pull/261#discussion_r63750446
--- Diff: stream/src/test/java/org/apache/apex/malhar/stream/sample/MyStreamTest.java ---
@@ -0,0 +1,121 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.apex.malhar.stream.sample;
+
+import java.util.Arrays;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+import org.junit.Assert;
+import org.junit.Test;
+
+import org.apache.apex.malhar.stream.api.function.Function;
+import org.apache.apex.malhar.stream.api.impl.StreamFactory;
+
+/**
+ * A test class which tests your own stream implementation built on the default one
+ */
+public class MyStreamTest
+{
+ static Map<Object, Integer> expected = new HashMap<>();
+ static {
+ expected.put("newword1", 4);
+ expected.put("newword2", 8);
+ expected.put("newword3", 4);
+ expected.put("newword4", 4);
+ expected.put("newword5", 4);
+ expected.put("newword7", 4);
+ expected.put("newword9", 6);
+ }
+
+ @Test
+ public void testMethodChainWordcount() throws Exception
+ {
+ TupleCollector<Map<Object, Integer>> collector = new TupleCollector<>();
+ collector.id = "testMethodChainWordcount";
+ new MyStream<>(StreamFactory.fromFolder("./src/test/resources/data"))
+ .<String, MyStream<String>>flatMap(new Function.FlatMapFunction<String, String>()
+ {
+ @Override
+ public Iterable<String> f(String input)
+ {
+ return Arrays.asList(input.split(" "));
+ }
+ }).filterAndMap(new Function.MapFunction<String, String>()
+ {
+ @Override
+ public String f(String input)
+ {
+ return input.replace("word", "newword");
+ }
+ }, new Function.FilterFunction<String>()
+ {
+ @Override
+ public Boolean f(String input)
+ {
+ return input.startsWith("word");
+ }
+ }).countByKey()
+ .addOperator(collector, collector.inputPort, collector.outputPort).print().runEmbedded(5000);
+
+
+ List<Map<Object, Integer>> data = (List<Map<Object, Integer>>)TupleCollector.results.get("testMethodChainWordcount");
+ Assert.assertTrue(data.size() > 1);
+ Assert.assertEquals(expected, data.get(data.size() - 1));
+ }
+
+ @Test
+ public void testNonMethodChainWordcount() throws Exception
+ {
+ TupleCollector<Map<Object, Integer>> collector = new TupleCollector<>();
+ collector.id = "testNonMethodChainWordcount";
+ MyStream<String> mystream = new MyStream<>(StreamFactory.fromFolder("./src/test/resources/data"))
+ .flatMap(new Function.FlatMapFunction<String, String>()
+ {
+ @Override
+ public Iterable<String> f(String input)
+ {
+ return Arrays.asList(input.split(" "));
+ }
+ });
+ mystream.filterAndMap(new Function.MapFunction<String, String>()
+ {
+ @Override
+ public String f(String input)
+ {
+ return input.replace("word", "newword");
+ }
+ }, new Function.FilterFunction<String>()
+ {
+ @Override
+ public Boolean f(String input)
+ {
+ return input.startsWith("word");
+ }
+ }).countByKey().addOperator(collector, collector.inputPort, collector.outputPort).print().runEmbedded(5000);
--- End diff --
The only existing API to shut down a DAG is to throw a ShutdownException, but that
doesn't guarantee the whole DAG has processed all the data, does it?
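To illustrate the concern with the fixed `runEmbedded(5000)` timeout: instead of sleeping for a fixed interval and hoping all tuples have arrived, a test could poll the collector until the expected result appears or a deadline passes. A minimal, self-contained sketch of such a polling helper (this `AwaitUtil` class is hypothetical, not part of the Malhar API):

```java
import java.util.concurrent.TimeUnit;
import java.util.function.BooleanSupplier;

public class AwaitUtil
{
  /**
   * Polls the condition every 100 ms until it holds or the timeout elapses.
   * Returns true if the condition became true within the timeout.
   */
  public static boolean await(BooleanSupplier condition, long timeoutMillis)
      throws InterruptedException
  {
    long deadline = System.nanoTime() + TimeUnit.MILLISECONDS.toNanos(timeoutMillis);
    while (System.nanoTime() < deadline) {
      if (condition.getAsBoolean()) {
        return true;
      }
      Thread.sleep(100);
    }
    // One final check in case the condition turned true right at the deadline.
    return condition.getAsBoolean();
  }
}
```

A test could then assert `await(() -> expected.equals(latestResult()), 5000)` rather than relying on a fixed embedded-run duration, though this still only bounds the wait; it does not make the shutdown itself deterministic.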
> Stream API Design
> -----------------
>
> Key: APEXMALHAR-2006
> URL: https://issues.apache.org/jira/browse/APEXMALHAR-2006
> Project: Apache Apex Malhar
> Issue Type: Sub-task
> Reporter: Siyuan Hua
> Assignee: Siyuan Hua
> Fix For: 3.4.0
>
>
> Construct DAG in a similar way as Flink/Spark Streaming
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)