damccorm commented on code in PR #32412:
URL: https://github.com/apache/beam/pull/32412#discussion_r1757590423


##########
website/www/site/content/en/blog/unit-testing-in-beam.md:
##########
@@ -178,20 +178,21 @@ def test_error_message_wrong_length(self, mock_get_data):
      result
 ```
 
-###The following cover other testing best practices:
+## Other testing best practices:
 
-1) Test all error messages you raise.
-2) Cover any edge cases that might be present in your data.
-3) Notice that in example 1, we could have written the `beam.Map` step with lambda functions:
+1) Test all error messages that you raise.
+2) Cover any edge cases that might exist in your data.
+3) Example 1 could have written the `beam.Map` step with lambda functions instead of with `beam.Map(median_house_value_per_bedroom)`:
 
 ```
 beam.Map(lambda x: x.strip().split(',')) | beam.Map(lambda x: float(x[8])/float(x[4]))
 ```
 
-, instead of `beam.Map(median_house_value_per_bedroom)`.  The latter (separating lambdas into a helper function) is the recommended approach for more testable code, as changes to the function would be modularized.
-5) Use the `assert_that` statement to ensure that PCollection values match up correctly, such as done above
+Separating lambdas into a helper function by using `beam.Map(median_house_value_per_bedroom)` is the recommended approach for more testable code, because changes to the function would be modularized.
+
+4) Use the `assert_that` statement to ensure that `PCollection` values match correctly, as in the previous example.
 
-For more pointed guidance on testing on Beam/Dataflow, see the [Google Cloud documentation](https://cloud.google.com/dataflow/docs/guides/develop-and-test-pipelines).
 Additionally, see some more examples of unit testing in Beam [here](https://github.com/apache/beam/blob/736cf50430b375d32093e793e1556567557614e9/sdks/python/apache_beam/ml/inference/base_test.py#L262).
+For more guidance about testing on Beam and Dataflow, see the [Google Cloud documentation](https://cloud.google.com/dataflow/docs/guides/develop-and-test-pipelines).
 For more examples of unit testing in Beam, see [the `base_test.py` code](https://github.com/apache/beam/blob/736cf50430b375d32093e793e1556567557614e9/sdks/python/apache_beam/ml/inference/base_test.py#L262).

Review Comment:
   ```suggestion
   For more guidance about testing on Beam and Dataflow, see the [Google Cloud documentation](https://cloud.google.com/dataflow/docs/guides/develop-and-test-pipelines).
 For more examples of unit testing in Beam, see [the base_test.py code](https://github.com/apache/beam/blob/736cf50430b375d32093e793e1556567557614e9/sdks/python/apache_beam/ml/inference/base_test.py#L262).
   ```
   
   This renders awkwardly with the backticks



##########
website/www/site/content/en/blog/unit-testing-in-beam.md:
##########

Review Comment:
   
https://apache-beam-website-pull-requests.storage.googleapis.com/32412/blog/unit-testing-in-beam/index.html
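
For readers skimming this thread, here is a minimal sketch of the pattern the revised blog text describes: moving the lambda logic into a named helper so it can be unit tested directly, and using `assert_that` to check `PCollection` contents on a `TestPipeline`. The `median_house_value_per_bedroom` body and the column indices (8 and 4) are assumptions taken from the snippet quoted in the diff above, not code from this PR.

```
# Minimal sketch; the helper body and column indices are assumed from the
# snippet quoted above, not taken verbatim from the blog post or this PR.
import unittest

import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to


def median_house_value_per_bedroom(row):
    # Hypothetical helper: split a CSV row and divide column 8 by column 4.
    fields = row.strip().split(',')
    return float(fields[8]) / float(fields[4])


class MedianHouseValueTest(unittest.TestCase):
    def test_helper_directly(self):
        # Because the logic lives in a named function instead of a lambda,
        # it can be tested without constructing a pipeline at all.
        self.assertEqual(median_house_value_per_bedroom('a,b,c,d,2,f,g,h,10'), 5.0)

    def test_pipeline_with_assert_that(self):
        # assert_that registers a check that runs when the pipeline executes.
        with TestPipeline() as p:
            result = (
                p
                | beam.Create(['a,b,c,d,2,f,g,h,10'])
                | beam.Map(median_house_value_per_bedroom)
            )
            assert_that(result, equal_to([5.0]))


if __name__ == '__main__':
    unittest.main()
```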


