[ https://issues.apache.org/jira/browse/BEAM-5758?focusedWorklogId=157585&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-157585 ]

ASF GitHub Bot logged work on BEAM-5758:
----------------------------------------

                Author: ASF GitHub Bot
            Created on: 23/Oct/18 14:43
            Start Date: 23/Oct/18 14:43
    Worklog Time Spent: 10m 
      Work Description: lgajowy closed pull request #6239: [BEAM-5758] Load tests of Python Synthetic Sources
URL: https://github.com/apache/beam/pull/6239
This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

diff --git a/sdks/python/apache_beam/testing/load_tests/__init__.py b/sdks/python/apache_beam/testing/load_tests/__init__.py
new file mode 100644
index 00000000000..cce3acad34a
--- /dev/null
+++ b/sdks/python/apache_beam/testing/load_tests/__init__.py
@@ -0,0 +1,16 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
diff --git a/sdks/python/apache_beam/testing/load_tests/co_group_by_key_test.py b/sdks/python/apache_beam/testing/load_tests/co_group_by_key_test.py
new file mode 100644
index 00000000000..6956f04531b
--- /dev/null
+++ b/sdks/python/apache_beam/testing/load_tests/co_group_by_key_test.py
@@ -0,0 +1,148 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+"""
+To run the test on DirectRunner:
+
+python setup.py nosetests \
+    --test-pipeline-options="--input_options='{
+        \"num_records\": 1000,
+        \"key_size\": 5,
+        \"value_size\":15,
+        \"bundle_size_distribution_type\": \"const\",
+        \"bundle_size_distribution_param\": 1,
+        \"force_initial_num_bundles\":0}'
+        --co_input_options='{
+        \"num_records\": 1000,
+        \"key_size\": 5,
+        \"value_size\":15,
+        \"bundle_size_distribution_type\": \"const\",
+        \"bundle_size_distribution_param\": 1,
+        \"force_initial_num_bundles\":0}'" \
+    --tests apache_beam.testing.load_tests.co_group_by_key_test
+
+To run the test on another runner (e.g. Dataflow):
+
+python setup.py nosetests \
+    --test-pipeline-options="
+        --runner=TestDataflowRunner
+        --project=...
+        --staging_location=gs://...
+        --temp_location=gs://...
+        --sdk_location=./dist/apache-beam-x.x.x.dev0.tar.gz
+        --input_options='{
+        \"num_records\": 1000,
+        \"key_size\": 5,
+        \"value_size\":15,
+        \"bundle_size_distribution_type\": \"const\",
+        \"bundle_size_distribution_param\": 1,
+        \"force_initial_num_bundles\":0
+        }'
+        --co_input_options='{
+        \"num_records\": 1000,
+        \"key_size\": 5,
+        \"value_size\":15,
+        \"bundle_size_distribution_type\": \"const\",
+        \"bundle_size_distribution_param\": 1,
+        \"force_initial_num_bundles\":0
+        }'" \
+    --tests apache_beam.testing.load_tests.co_group_by_key_test
+
+"""
+
+from __future__ import absolute_import
+
+import json
+import logging
+import unittest
+
+import apache_beam as beam
+from apache_beam.testing import synthetic_pipeline
+from apache_beam.testing.load_tests.load_test_metrics_utils import MeasureTime
+from apache_beam.testing.test_pipeline import TestPipeline
+
+INPUT_TAG = 'pc1'
+CO_INPUT_TAG = 'pc2'
+
+
+class CoGroupByKeyTest(unittest.TestCase):
+
+  def parseTestPipelineOptions(self, options):
+    return {
+        'numRecords': options.get('num_records'),
+        'keySizeBytes': options.get('key_size'),
+        'valueSizeBytes': options.get('value_size'),
+        'bundleSizeDistribution': {
+            'type': options.get(
+                'bundle_size_distribution_type', 'const'
+            ),
+            'param': options.get('bundle_size_distribution_param', 0)
+        },
+        'forceNumInitialBundles': options.get(
+            'force_initial_num_bundles', 0
+        )
+    }
+
+  def setUp(self):
+    self.pipeline = TestPipeline(is_integration_test=True)
+    self.inputOptions = json.loads(self.pipeline.get_option('input_options'))
+    self.coInputOptions = json.loads(
+        self.pipeline.get_option('co_input_options'))
+
+  class _Ungroup(beam.DoFn):
+    def process(self, element):
+      values = element[1]
+      inputs = values.get(INPUT_TAG)
+      co_inputs = values.get(CO_INPUT_TAG)
+      for i in inputs:
+        yield i
+      for i in co_inputs:
+        yield i
+
+  def testCoGroupByKey(self):
+    with self.pipeline as p:
+      pc1 = (p
+             | 'Read ' + INPUT_TAG >> beam.io.Read(
+                 synthetic_pipeline.SyntheticSource(
+                     self.parseTestPipelineOptions(self.inputOptions)))
+             | 'Make ' + INPUT_TAG + ' iterable' >> beam.Map(lambda x: (x, x))
+            )
+
+      pc2 = (p
+             | 'Read ' + CO_INPUT_TAG >> beam.io.Read(
+                 synthetic_pipeline.SyntheticSource(
+                     self.parseTestPipelineOptions(self.coInputOptions)))
+             | 'Make ' + CO_INPUT_TAG + ' iterable' >> beam.Map(
+                 lambda x: (x, x))
+            )
+      # pylint: disable=expression-not-assigned
+      ({INPUT_TAG: pc1, CO_INPUT_TAG: pc2}
+       | 'CoGroupByKey: ' >> beam.CoGroupByKey()
+       | 'Consume Joined Collections' >> beam.ParDo(self._Ungroup())
+       | 'Measure time' >> beam.ParDo(MeasureTime())
+      )
+
+      result = p.run()
+      result.wait_until_finish()
+      metrics = result.metrics().query()
+
+      for dist in metrics['distributions']:
+        logging.info("Distribution: %s", dist)
+
+
+if __name__ == '__main__':
+  logging.getLogger().setLevel(logging.INFO)
+  unittest.main()
diff --git a/sdks/python/apache_beam/testing/load_tests/combine_test.py b/sdks/python/apache_beam/testing/load_tests/combine_test.py
new file mode 100644
index 00000000000..187a0465710
--- /dev/null
+++ b/sdks/python/apache_beam/testing/load_tests/combine_test.py
@@ -0,0 +1,111 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+"""
+To run the test on DirectRunner:
+
+python setup.py nosetests \
+    --test-pipeline-options="
+    --input_options='{
+    \"num_records\": 300,
+    \"key_size\": 5,
+    \"value_size\":15,
+    \"bundle_size_distribution_type\": \"const\",
+    \"bundle_size_distribution_param\": 1,
+    \"force_initial_num_bundles\": 0
+    }'" \
+    --tests apache_beam.testing.load_tests.combine_test
+
+To run the test on another runner (e.g. Dataflow):
+
+python setup.py nosetests \
+    --test-pipeline-options="
+        --runner=TestDataflowRunner
+        --project=...
+        --staging_location=gs://...
+        --temp_location=gs://...
+        --sdk_location=./dist/apache-beam-x.x.x.dev0.tar.gz
+        --input_options='{
+        \"num_records\": 1000,
+        \"key_size\": 5,
+        \"value_size\":15,
+        \"bundle_size_distribution_type\": \"const\",
+        \"bundle_size_distribution_param\": 1,
+        \"force_initial_num_bundles\": 0
+        }'" \
+    --tests apache_beam.testing.load_tests.combine_test
+
+"""
+
+from __future__ import absolute_import
+
+import json
+import logging
+import unittest
+
+import apache_beam as beam
+from apache_beam.testing import synthetic_pipeline
+from apache_beam.testing.load_tests.load_test_metrics_utils import MeasureTime
+from apache_beam.testing.test_pipeline import TestPipeline
+
+
+class CombineTest(unittest.TestCase):
+  def parseTestPipelineOptions(self):
+    return {
+        'numRecords': self.inputOptions.get('num_records'),
+        'keySizeBytes': self.inputOptions.get('key_size'),
+        'valueSizeBytes': self.inputOptions.get('value_size'),
+        'bundleSizeDistribution': {
+            'type': self.inputOptions.get(
+                'bundle_size_distribution_type', 'const'
+            ),
+            'param': self.inputOptions.get('bundle_size_distribution_param', 0)
+        },
+        'forceNumInitialBundles': self.inputOptions.get(
+            'force_initial_num_bundles', 0
+        )
+    }
+
+  def setUp(self):
+    self.pipeline = TestPipeline(is_integration_test=True)
+    self.inputOptions = json.loads(self.pipeline.get_option('input_options'))
+
+  class _GetElement(beam.DoFn):
+    def process(self, element):
+      yield element
+
+  def testCombineGlobally(self):
+    with self.pipeline as p:
+      # pylint: disable=expression-not-assigned
+      (p
+       | beam.io.Read(synthetic_pipeline.SyntheticSource(
+           self.parseTestPipelineOptions()))
+       | 'Measure time' >> beam.ParDo(MeasureTime())
+       | 'Combine with Top' >> beam.CombineGlobally(
+           beam.combiners.TopCombineFn(1000))
+       | 'Consume' >> beam.ParDo(self._GetElement())
+      )
+
+      result = p.run()
+      result.wait_until_finish()
+      metrics = result.metrics().query()
+      for dist in metrics['distributions']:
+        logging.info("Distribution: %s", dist)
+
+
+if __name__ == '__main__':
+  logging.getLogger().setLevel(logging.DEBUG)
+  unittest.main()
diff --git a/sdks/python/apache_beam/testing/load_tests/group_by_key_test.py b/sdks/python/apache_beam/testing/load_tests/group_by_key_test.py
new file mode 100644
index 00000000000..d019a122e7b
--- /dev/null
+++ b/sdks/python/apache_beam/testing/load_tests/group_by_key_test.py
@@ -0,0 +1,106 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+"""
+To run the test on DirectRunner:
+
+python setup.py nosetests \
+    --test-pipeline-options="--input_options='{
+    \"num_records\": 300,
+    \"key_size\": 5,
+    \"value_size\":15,
+    \"bundle_size_distribution_type\": \"const\",
+    \"bundle_size_distribution_param\": 1,
+    \"force_initial_num_bundles\": 0
+    }'" \
+    --tests apache_beam.testing.load_tests.group_by_key_test
+
+To run the test on another runner (e.g. Dataflow):
+
+python setup.py nosetests \
+    --test-pipeline-options="
+        --runner=TestDataflowRunner
+        --project=...
+        --staging_location=gs://...
+        --temp_location=gs://...
+        --sdk_location=./dist/apache-beam-x.x.x.dev0.tar.gz
+        --input_options='{
+        \"num_records\": 1000,
+        \"key_size\": 5,
+        \"value_size\":15,
+        \"bundle_size_distribution_type\": \"const\",
+        \"bundle_size_distribution_param\": 1,
+        \"force_initial_num_bundles\": 0
+        }'" \
+    --tests apache_beam.testing.load_tests.group_by_key_test
+
+"""
+
+from __future__ import absolute_import
+
+import json
+import logging
+import unittest
+
+import apache_beam as beam
+from apache_beam.testing import synthetic_pipeline
+from apache_beam.testing.load_tests.load_test_metrics_utils import MeasureTime
+from apache_beam.testing.test_pipeline import TestPipeline
+
+
+class GroupByKeyTest(unittest.TestCase):
+  def parseTestPipelineOptions(self):
+    return {
+        'numRecords': self.inputOptions.get('num_records'),
+        'keySizeBytes': self.inputOptions.get('key_size'),
+        'valueSizeBytes': self.inputOptions.get('value_size'),
+        'bundleSizeDistribution': {
+            'type': self.inputOptions.get(
+                'bundle_size_distribution_type', 'const'
+            ),
+            'param': self.inputOptions.get('bundle_size_distribution_param', 0)
+        },
+        'forceNumInitialBundles': self.inputOptions.get(
+            'force_initial_num_bundles', 0
+        )
+    }
+
+  def setUp(self):
+    self.pipeline = TestPipeline(is_integration_test=True)
+    self.inputOptions = json.loads(self.pipeline.get_option('input_options'))
+
+  def testGroupByKey(self):
+    with self.pipeline as p:
+      # pylint: disable=expression-not-assigned
+      (p
+       | beam.io.Read(synthetic_pipeline.SyntheticSource(
+           self.parseTestPipelineOptions()))
+       | 'Measure time' >> beam.ParDo(MeasureTime())
+       | 'GroupByKey' >> beam.GroupByKey()
+       | 'Ungroup' >> beam.FlatMap(
+           lambda elm: [(elm[0], v) for v in elm[1]])
+      )
+
+      result = p.run()
+      result.wait_until_finish()
+      metrics = result.metrics().query()
+      for dist in metrics['distributions']:
+        logging.info("Distribution: %s", dist)
+
+
+if __name__ == '__main__':
+  logging.getLogger().setLevel(logging.DEBUG)
+  unittest.main()
diff --git a/sdks/python/apache_beam/testing/load_tests/load_test_metrics_utils.py b/sdks/python/apache_beam/testing/load_tests/load_test_metrics_utils.py
new file mode 100644
index 00000000000..393be210d71
--- /dev/null
+++ b/sdks/python/apache_beam/testing/load_tests/load_test_metrics_utils.py
@@ -0,0 +1,42 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+"""
+Utility functions used for integrating Metrics API into load tests pipelines.
+"""
+
+from __future__ import absolute_import
+
+import time
+
+import apache_beam as beam
+from apache_beam.metrics import Metrics
+
+
+class MeasureTime(beam.DoFn):
+  def __init__(self):
+    self.runtime_start = Metrics.distribution('pardo', 'runtime.start')
+    self.runtime_end = Metrics.distribution('pardo', 'runtime.end')
+
+  def start_bundle(self):
+    self.runtime_start.update(time.time())
+
+  def finish_bundle(self):
+    self.runtime_end.update(time.time())
+
+  def process(self, element):
+    yield element
diff --git a/sdks/python/apache_beam/testing/load_tests/pardo_test.py b/sdks/python/apache_beam/testing/load_tests/pardo_test.py
new file mode 100644
index 00000000000..a1c753ef27d
--- /dev/null
+++ b/sdks/python/apache_beam/testing/load_tests/pardo_test.py
@@ -0,0 +1,156 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+"""
+To run the test on DirectRunner:
+
+python setup.py nosetests \
+    --test-pipeline-options="
+    --number_of_counter_operations=1000
+    --input_options='{
+    \"num_records\": 300,
+    \"key_size\": 5,
+    \"value_size\":15,
+    \"bundle_size_distribution_type\": \"const\",
+    \"bundle_size_distribution_param\": 1,
+    \"force_initial_num_bundles\": 0
+    }'" \
+    --tests apache_beam.testing.load_tests.pardo_test
+
+To run the test on another runner (e.g. Dataflow):
+
+python setup.py nosetests \
+    --test-pipeline-options="
+        --runner=TestDataflowRunner
+        --project=...
+        --staging_location=gs://...
+        --temp_location=gs://...
+        --sdk_location=./dist/apache-beam-x.x.x.dev0.tar.gz
+        --output=gc
+        --number_of_counter_operations=1000
+        --input_options='{
+        \"num_records\": 1000,
+        \"key_size\": 5,
+        \"value_size\":15,
+        \"bundle_size_distribution_type\": \"const\",
+        \"bundle_size_distribution_param\": 1,
+        \"force_initial_num_bundles\": 0
+        }'" \
+    --tests apache_beam.testing.load_tests.pardo_test
+
+"""
+
+from __future__ import absolute_import
+
+import json
+import logging
+import time
+import unittest
+
+import apache_beam as beam
+from apache_beam.metrics import Metrics
+from apache_beam.testing import synthetic_pipeline
+from apache_beam.testing.load_tests.load_test_metrics_utils import MeasureTime
+from apache_beam.testing.test_pipeline import TestPipeline
+
+
+class ParDoTest(unittest.TestCase):
+  def parseTestPipelineOptions(self):
+    return {'numRecords': self.inputOptions.get('num_records'),
+            'keySizeBytes': self.inputOptions.get('key_size'),
+            'valueSizeBytes': self.inputOptions.get('value_size'),
+            'bundleSizeDistribution': {
+                'type': self.inputOptions.get(
+                    'bundle_size_distribution_type', 'const'
+                ),
+                'param': self.inputOptions.get(
+                    'bundle_size_distribution_param', 0
+                )
+            },
+            'forceNumInitialBundles': self.inputOptions.get(
+                'force_initial_num_bundles', 0
+            )
+           }
+
+  def setUp(self):
+    self.pipeline = TestPipeline(is_integration_test=True)
+    self.output = self.pipeline.get_option('output')
+    self.iterations = self.pipeline.get_option('number_of_counter_operations')
+    self.inputOptions = json.loads(self.pipeline.get_option('input_options'))
+
+  class _MeasureTime(beam.DoFn):
+    def __init__(self):
+      self.runtime_start = Metrics.distribution('pardo', 'runtime.start')
+      self.runtime_end = Metrics.distribution('pardo', 'runtime.end')
+
+    def start_bundle(self):
+      self.runtime_start.update(time.time())
+
+    def finish_bundle(self):
+      self.runtime_end.update(time.time())
+
+    def process(self, element):
+      yield element
+
+  class _GetElement(beam.DoFn):
+    def __init__(self):
+      self.counter = Metrics.counter('pardo', 'total_bytes.count')
+
+    def process(self, element):
+      _, value = element
+      for i in range(len(value)):
+        self.counter.inc(i)
+      yield element
+
+  def testParDo(self):
+    if self.iterations is None:
+      num_runs = 1
+    else:
+      num_runs = int(self.iterations)
+
+    with self.pipeline as p:
+      pc = (p
+            | 'Read synthetic' >> beam.io.Read(
+                synthetic_pipeline.SyntheticSource(
+                    self.parseTestPipelineOptions()
+                ))
+            | 'Measure time' >> beam.ParDo(MeasureTime())
+           )
+
+      for i in range(num_runs):
+        label = 'Step: %d' % i
+        pc = (pc
+              | label >> beam.ParDo(self._GetElement()))
+
+      if self.output is not None:
+        # pylint: disable=expression-not-assigned
+        (pc
+         | "Write" >> beam.io.WriteToText(self.output)
+        )
+
+      result = p.run()
+      result.wait_until_finish()
+      metrics = result.metrics().query()
+      for counter in metrics['counters']:
+        logging.info("Counter: %s", counter)
+
+      for dist in metrics['distributions']:
+        logging.info("Distribution: %s", dist)
+
+
+if __name__ == '__main__':
+  logging.getLogger().setLevel(logging.INFO)
+  unittest.main()
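
Each test file in the diff above repeats the same mapping from the snake_case
keys in `--input_options` to the camelCase spec that
`synthetic_pipeline.SyntheticSource` expects. As a standalone sketch of that
translation (plain Python with no Beam dependency; the function name here is
ours, not Beam's):

```python
import json


def parse_synthetic_source_options(options):
    # Translate snake_case JSON keys from --input_options into the camelCase
    # spec used by SyntheticSource, mirroring parseTestPipelineOptions above.
    return {
        'numRecords': options.get('num_records'),
        'keySizeBytes': options.get('key_size'),
        'valueSizeBytes': options.get('value_size'),
        'bundleSizeDistribution': {
            'type': options.get('bundle_size_distribution_type', 'const'),
            'param': options.get('bundle_size_distribution_param', 0),
        },
        'forceNumInitialBundles': options.get('force_initial_num_bundles', 0),
    }


# The same JSON payload the docstrings pass via --input_options.
raw = ('{"num_records": 1000, "key_size": 5, "value_size": 15,'
       ' "bundle_size_distribution_type": "const",'
       ' "bundle_size_distribution_param": 1,'
       ' "force_initial_num_bundles": 0}')
spec = parse_synthetic_source_options(json.loads(raw))
```

Omitted keys fall back to a constant bundle-size distribution with
`forceNumInitialBundles` of 0, matching the defaults in the tests.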


 

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


Issue Time Tracking
-------------------

    Worklog Id:     (was: 157585)
    Time Spent: 2h  (was: 1h 50m)

> Load tests for SyntheticSources in Python
> -----------------------------------------
>
>                 Key: BEAM-5758
>                 URL: https://issues.apache.org/jira/browse/BEAM-5758
>             Project: Beam
>          Issue Type: Test
>          Components: testing
>            Reporter: Kasia Kucharczyk
>            Assignee: Kasia Kucharczyk
>            Priority: Major
>          Time Spent: 2h
>  Remaining Estimate: 0h
>
> For the purpose of load testing the SyntheticSources, tests should be
> created with the following transformations:
>  * ParDo
>  * Combine
>  * GroupByKey
>  * CoGroupByKey.
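
For the CoGroupByKey case, the test in the PR joins two tagged collections and
then flattens the grouped values back out with an `_Ungroup` DoFn. A minimal
plain-Python model of that join-then-ungroup step (helper names are ours and
hypothetical; no Beam dependency):

```python
from collections import defaultdict

INPUT_TAG = 'pc1'
CO_INPUT_TAG = 'pc2'


def co_group_by_key(tagged_pcollections):
    # Model CoGroupByKey: for each key, build a dict mapping each tag to the
    # list of values that arrived under that tag.
    grouped = defaultdict(lambda: {tag: [] for tag in tagged_pcollections})
    for tag, pcoll in tagged_pcollections.items():
        for key, value in pcoll:
            grouped[key][tag].append(value)
    return dict(grouped)


def ungroup(grouped):
    # Mirror the _Ungroup DoFn: emit every value from both tags.
    for values in grouped.values():
        for v in values[INPUT_TAG]:
            yield v
        for v in values[CO_INPUT_TAG]:
            yield v


# 'Make pc1 iterable' in the test maps each element x to (x, x).
pc1 = [('a', 'a'), ('b', 'b')]
pc2 = [('a', 'a'), ('c', 'c')]
joined = co_group_by_key({INPUT_TAG: pc1, CO_INPUT_TAG: pc2})
flattened = sorted(ungroup(joined))
```

Keys present in only one collection still appear in the join, paired with an
empty list for the other tag, which is why `_Ungroup` can iterate both tags
unconditionally.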



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
