See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/6/display/redirect>
------------------------------------------
[...truncated 550.64 KB...]
"@type": "kind:global_window" } ], "is_wrapper": true }, "output_name": "out", "user_name": "assert:even/Match.out" } ], "parallel_input": { "@type": "OutputReference", "output_name": "out", "step_name": "s31" }, "serialized_fn": "<string of 1148 bytes>", "user_name": "assert:even/Match" } } ], "type": "JOB_TYPE_BATCH" }
root: INFO: Create job: <Job createTime: u'2017-10-03T22:16:49.017479Z' currentStateTime: u'1970-01-01T00:00:00Z' id: u'2017-10-03_15_16_48-10793923670148269614' location: u'us-central1' name: u'beamapp-jenkins-1003221647-351871' projectId: u'apache-beam-testing' stageStates: [] steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2017-10-03_15_16_48-10793923670148269614]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2017-10-03_15_16_48-10793923670148269614?project=apache-beam-testing
root: INFO: Job 2017-10-03_15_16_48-10793923670148269614 is in state JOB_STATE_PENDING
root: INFO: 2017-10-03T22:16:48.595Z: JOB_MESSAGE_WARNING: (95cbb863abde436a): Setting the number of workers (1) disables autoscaling for this job. If you are trying to cap autoscaling, consider only setting max_num_workers. If you want to disable autoscaling altogether, the documented way is to explicitly use autoscalingAlgorithm=NONE.
root: INFO: 2017-10-03T22:16:50.730Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9bbf): Checking required Cloud APIs are enabled.
root: INFO: 2017-10-03T22:16:51.467Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9680): Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2017-10-03T22:16:51.475Z: JOB_MESSAGE_DEBUG: (a2fb37fe80aa940f): Combiner lifting skipped for step assert:even/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2017-10-03T22:16:51.477Z: JOB_MESSAGE_DEBUG: (a2fb37fe80aa9d19): Combiner lifting skipped for step assert:odd/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2017-10-03T22:16:51.480Z: JOB_MESSAGE_DEBUG: (a2fb37fe80aa9623): Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2017-10-03T22:16:51.482Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9f2d): Expanding GroupByKey operations into optimizable parts.
root: INFO: 2017-10-03T22:16:51.486Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9837): Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2017-10-03T22:16:51.496Z: JOB_MESSAGE_DEBUG: (a2fb37fe80aa9c5f): Annotating graph with Autotuner information.
root: INFO: 2017-10-03T22:16:51.511Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9e73): Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2017-10-03T22:16:51.514Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa977d): Unzipping flatten s8 for input s6.out
root: INFO: 2017-10-03T22:16:51.517Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9087): Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten , into producer assert_that/Group/pair_with_0
root: INFO: 2017-10-03T22:16:51.519Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9991): Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2017-10-03T22:16:51.522Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa929b): Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2017-10-03T22:16:51.524Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9ba5): Fusing consumer assert:odd/Unkey into assert:odd/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2017-10-03T22:16:51.526Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa94af): Fusing consumer assert:odd/Match into assert:odd/Unkey
root: INFO: 2017-10-03T22:16:51.529Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9db9): Fusing consumer assert:odd/Group/Map(_merge_tagged_vals_under_key) into assert:odd/Group/GroupByKey/GroupByWindow
root: INFO: 2017-10-03T22:16:51.531Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa96c3): Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2017-10-03T22:16:51.534Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9fcd): Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2017-10-03T22:16:51.536Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa98d7): Fusing consumer assert:odd/Group/GroupByKey/GroupByWindow into assert:odd/Group/GroupByKey/Read
root: INFO: 2017-10-03T22:16:51.539Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa91e1): Fusing consumer assert:even/Match into assert:even/Unkey
root: INFO: 2017-10-03T22:16:51.541Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9aeb): Fusing consumer assert:odd/Group/GroupByKey/Write into assert:odd/Group/GroupByKey/Reify
root: INFO: Job 2017-10-03_15_16_48-10793923670148269614 is in state JOB_STATE_RUNNING
root: INFO: 2017-10-03T22:16:51.544Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa93f5): Unzipping flatten s18 for input s16.out
root: INFO: 2017-10-03T22:16:51.547Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9cff): Fusing unzipped copy of assert:odd/Group/GroupByKey/Reify, through flatten , into producer assert:odd/Group/pair_with_0
root: INFO: 2017-10-03T22:16:51.550Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9609): Unzipping flatten s8-u31 for input s9-reify-value18-c29
root: INFO: 2017-10-03T22:16:51.552Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9f13): Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten , into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2017-10-03T22:16:51.555Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa981d): Unzipping flatten s28 for input s26.out
root: INFO: 2017-10-03T22:16:51.559Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9127): Fusing unzipped copy of assert:even/Group/GroupByKey/Reify, through flatten , into producer assert:even/Group/pair_with_0
root: INFO: 2017-10-03T22:16:51.561Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9a31): Fusing consumer assert:even/Group/GroupByKey/GroupByWindow into assert:even/Group/GroupByKey/Read
root: INFO: 2017-10-03T22:16:51.563Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa933b): Fusing consumer assert:even/Unkey into assert:even/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2017-10-03T22:16:51.566Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9c45): Fusing consumer assert:even/Group/Map(_merge_tagged_vals_under_key) into assert:even/Group/GroupByKey/GroupByWindow
root: INFO: 2017-10-03T22:16:51.568Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa954f): Unzipping flatten s28-u42 for input s29-reify-value9-c40
root: INFO: 2017-10-03T22:16:51.571Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9e59): Fusing unzipped copy of assert:even/Group/GroupByKey/Write, through flatten , into producer assert:even/Group/GroupByKey/Reify
root: INFO: 2017-10-03T22:16:51.575Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9763): Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2017-10-03T22:16:51.577Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa906d): Fusing consumer assert:even/Group/GroupByKey/Write into assert:even/Group/GroupByKey/Reify
root: INFO: 2017-10-03T22:16:51.580Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9977): Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2017-10-03T22:16:51.584Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9281): Fusing consumer assert:even/Group/GroupByKey/Reify into assert:even/Group/pair_with_1
root: INFO: 2017-10-03T22:16:51.587Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9b8b): Fusing consumer assert:odd/Group/GroupByKey/Reify into assert:odd/Group/pair_with_1
root: INFO: 2017-10-03T22:16:51.589Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9495): Fusing consumer assert:even/ToVoidKey into assert:even/WindowInto(WindowIntoFn)
root: INFO: 2017-10-03T22:16:51.592Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9d9f): Fusing consumer assert:even/Group/pair_with_1 into assert:even/ToVoidKey
root: INFO: 2017-10-03T22:16:51.596Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa96a9): Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2017-10-03T22:16:51.599Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9fb3): Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2017-10-03T22:16:51.601Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa98bd): Fusing consumer assert:even/WindowInto(WindowIntoFn) into ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:245>)
root: INFO: 2017-10-03T22:16:51.604Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa91c7): Fusing consumer assert:odd/ToVoidKey into assert:odd/WindowInto(WindowIntoFn)
root: INFO: 2017-10-03T22:16:51.606Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9ad1): Fusing consumer assert_that/WindowInto(WindowIntoFn) into ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:245>)
root: INFO: 2017-10-03T22:16:51.609Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa93db): Fusing consumer assert:odd/Group/pair_with_1 into assert:odd/ToVoidKey
root: INFO: 2017-10-03T22:16:51.611Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9ce5): Fusing consumer assert:odd/WindowInto(WindowIntoFn) into ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:245>)
root: INFO: 2017-10-03T22:16:51.614Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa95ef): Fusing consumer ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:245>) into Some Numbers/Read
root: INFO: 2017-10-03T22:16:51.616Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9ef9): Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2017-10-03T22:16:51.619Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9803): Fusing consumer assert:even/Group/pair_with_0 into assert:even/Create/Read
root: INFO: 2017-10-03T22:16:51.623Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa910d): Fusing consumer assert:odd/Group/pair_with_0 into assert:odd/Create/Read
root: INFO: 2017-10-03T22:16:51.630Z: JOB_MESSAGE_DEBUG: (a2fb37fe80aa9a17): Workflow config is missing a default resource spec.
root: INFO: 2017-10-03T22:16:51.633Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9321): Adding StepResource setup and teardown to workflow graph.
root: INFO: 2017-10-03T22:16:51.636Z: JOB_MESSAGE_DEBUG: (a2fb37fe80aa9c2b): Adding workflow start and stop steps.
root: INFO: 2017-10-03T22:16:51.639Z: JOB_MESSAGE_DEBUG: (a2fb37fe80aa9535): Assigning stage ids.
root: INFO: 2017-10-03T22:16:51.780Z: JOB_MESSAGE_DEBUG: (90e87d0f3e73a2db): Executing wait step start54
root: INFO: 2017-10-03T22:16:51.813Z: JOB_MESSAGE_BASIC: (3f32590822c78eb3): Executing operation assert:odd/Group/GroupByKey/Create
root: INFO: 2017-10-03T22:16:51.830Z: JOB_MESSAGE_DEBUG: (3dd540555683d85c): Starting worker pool setup.
root: INFO: 2017-10-03T22:16:51.837Z: JOB_MESSAGE_BASIC: (3dd540555683dbaa): Starting 1 workers in us-central1-f...
root: INFO: 2017-10-03T22:16:51.845Z: JOB_MESSAGE_BASIC: (90e87d0f3e73a972): Executing operation assert:even/Group/GroupByKey/Create
root: INFO: 2017-10-03T22:16:51.875Z: JOB_MESSAGE_BASIC: (99debcd01cdf3953): Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2017-10-03T22:16:51.905Z: JOB_MESSAGE_DEBUG: (3f32590822c78c5d): Value "assert:odd/Group/GroupByKey/Session" materialized.
root: INFO: 2017-10-03T22:16:51.942Z: JOB_MESSAGE_DEBUG: (99debcd01cdf345d): Value "assert:even/Group/GroupByKey/Session" materialized.
root: INFO: 2017-10-03T22:16:51.957Z: JOB_MESSAGE_DEBUG: (3f32590822c78a07): Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2017-10-03T22:16:51.981Z: JOB_MESSAGE_BASIC: (90e87d0f3e73a2dc): Executing operation assert:odd/Create/Read+assert:odd/Group/pair_with_0+assert:odd/Group/GroupByKey/Reify+assert:odd/Group/GroupByKey/Write
root: INFO: 2017-10-03T22:16:52.013Z: JOB_MESSAGE_BASIC: (3f32590822c78b20): Executing operation assert:even/Create/Read+assert:even/Group/pair_with_0+assert:even/Group/GroupByKey/Reify+assert:even/Group/GroupByKey/Write
root: INFO: 2017-10-03T22:16:52.051Z: JOB_MESSAGE_BASIC: (99debcd01cdf37e6): Executing operation Some Numbers/Read+ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:245>)+assert:even/WindowInto(WindowIntoFn)+assert:even/ToVoidKey+assert:even/Group/pair_with_1+assert:even/Group/GroupByKey/Reify+assert:even/Group/GroupByKey/Write+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write+assert:odd/WindowInto(WindowIntoFn)+assert:odd/ToVoidKey+assert:odd/Group/pair_with_1+assert:odd/Group/GroupByKey/Reify+assert:odd/Group/GroupByKey/Write
root: INFO: 2017-10-03T22:16:52.085Z: JOB_MESSAGE_BASIC: (c0c8842e57a10916): Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2017-10-03T22:17:01.084Z: JOB_MESSAGE_DETAILED: (9a7e4ca204ffbfb6): Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2017-10-03T22:17:16.891Z: JOB_MESSAGE_DETAILED: (9a7e4ca204ffb821): Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2017-10-03T22:18:03.828Z: JOB_MESSAGE_DETAILED: (8fa5a35157dd4c1e): Workers have started successfully.
root: INFO: 2017-10-03T22:21:14.877Z: JOB_MESSAGE_ERROR: (7f6d75f74edb506): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 582, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 294, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:10591)
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 295, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:10485)
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 300, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:9686)
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 225, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 277, in loads
    return load(file)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 266, in load
    obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 767, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/ptransform_test.py", line 27, in <module>
    import hamcrest as hc
ImportError: No module named hamcrest
root: INFO: 2017-10-03T22:21:18.486Z: JOB_MESSAGE_ERROR: (7f6d75f74edb83a): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 582, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 294, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:10591)
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 295, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:10485)
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 300, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:9686)
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 225, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 277, in loads
    return load(file)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 266, in load
    obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 767, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/ptransform_test.py", line 27, in <module>
    import hamcrest as hc
ImportError: No module named hamcrest
root: INFO: 2017-10-03T22:21:22.204Z: JOB_MESSAGE_ERROR: (7f6d75f74edbb6e): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 582, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 294, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:10591)
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 295, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:10485)
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 300, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:9686)
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 225, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 277, in loads
    return load(file)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 266, in load
    obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 767, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/ptransform_test.py", line 27, in <module>
    import hamcrest as hc
ImportError: No module named hamcrest
root: INFO: 2017-10-03T22:21:25.809Z: JOB_MESSAGE_ERROR: (7f6d75f74edbea2): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 582, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 294, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:10591)
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 295, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:10485)
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 300, in apache_beam.runners.worker.operations.DoOperation.start (apache_beam/runners/worker/operations.c:9686)
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 225, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 277, in loads
    return load(file)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 266, in load
    obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 767, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/ptransform_test.py", line 27, in <module>
    import hamcrest as hc
ImportError: No module named hamcrest
root: INFO: 2017-10-03T22:21:25.854Z: JOB_MESSAGE_DEBUG: (fc4ea6d4bd70a292): Executing failure step failure53
root: INFO: 2017-10-03T22:21:25.857Z: JOB_MESSAGE_ERROR: (fc4ea6d4bd70aac8): Workflow failed. Causes: (99debcd01cdf33ee): S05:Some Numbers/Read+ClassifyNumbers/FlatMap(<lambda at ptransform_test.py:245>)+assert:even/WindowInto(WindowIntoFn)+assert:even/ToVoidKey+assert:even/Group/pair_with_1+assert:even/Group/GroupByKey/Reify+assert:even/Group/GroupByKey/Write+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write+assert:odd/WindowInto(WindowIntoFn)+assert:odd/ToVoidKey+assert:odd/Group/pair_with_1+assert:odd/Group/GroupByKey/Reify+assert:odd/Group/GroupByKey/Write failed., (2a8164e1ad7f5898): A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: beamapp-jenkins-100322164-10031516-401b-harness-5zkv, beamapp-jenkins-100322164-10031516-401b-harness-5zkv, beamapp-jenkins-100322164-10031516-401b-harness-5zkv, beamapp-jenkins-100322164-10031516-401b-harness-5zkv
root: INFO: 2017-10-03T22:21:25.962Z: JOB_MESSAGE_DETAILED: (a2fb37fe80aa9267): Cleaning up.
root: INFO: 2017-10-03T22:21:25.965Z: JOB_MESSAGE_DEBUG: (a2fb37fe80aa9b71): Starting worker pool teardown.
root: INFO: 2017-10-03T22:21:25.968Z: JOB_MESSAGE_BASIC: (a2fb37fe80aa947b): Stopping worker pool...
root: INFO: 2017-10-03T22:22:26.198Z: JOB_MESSAGE_DETAILED: (9a7e4ca204ffbcc6): Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2017-10-03T22:22:26.235Z: JOB_MESSAGE_DEBUG: (a2fb37fe80aa9ab7): Tearing down pending resources...
root: INFO: Job 2017-10-03_15_16_48-10793923670148269614 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 15 tests in 1233.931s

FAILED (errors=5)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2017-10-03_15_11_55-10260111686362628935?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2017-10-03_15_17_32-6632238737687175964?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2017-10-03_15_22_14-18348821223512924084?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2017-10-03_15_27_10-408080182060752159?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2017-10-03_15_11_56-104824498886235475?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2017-10-03_15_16_47-3660413317865190474?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2017-10-03_15_21_59-17560023927188767041?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2017-10-03_15_27_06-15924466671671558243?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2017-10-03_15_11_54-17807115556218009545?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2017-10-03_15_16_56-585840065622121380?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2017-10-03_15_21_53-1493130550276211788?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2017-10-03_15_26_59-16581812431703474889?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2017-10-03_15_11_55-17365124932940978345?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2017-10-03_15_16_48-10793923670148269614?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2017-10-03_15_22_34-3986678221725099211?project=apache-beam-testing
Build step 'Execute shell' marked build as failure
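Editor's note on the failure: all five errors in the log are the same event. The Dataflow worker unpickles the serialized DoFn via `pickler.loads()` / `dill.loads()`, which re-imports the module that defined the lambda (`apache_beam/transforms/ptransform_test.py`), and that module's top-level `import hamcrest as hc` fails because the PyHamcrest test dependency is not installed on the worker. The sketch below reproduces the mechanism only, not the Beam setup: it uses a hypothetical throwaway module `fake_test_module` and stdlib `pickle` in place of dill, and makes the whole module unimportable rather than just one of its imports.

```python
# Functions pickle by reference (module name + attribute name), so
# unpickling must re-import the defining module. If that import fails
# on the loading side -- the "worker" -- loads() raises ImportError,
# matching the shape of the tracebacks in the log above.
import os
import pickle
import sys
import tempfile

# "Submitting" side: create and import a throwaway module.
tmpdir = tempfile.mkdtemp()
module_path = os.path.join(tmpdir, "fake_test_module.py")
with open(module_path, "w") as f:
    f.write("def classify(n):\n    return n % 2\n")
sys.path.insert(0, tmpdir)

import fake_test_module
blob = pickle.dumps(fake_test_module.classify)  # a reference, not the code

# "Worker" side: make the defining module unimportable.
sys.path.remove(tmpdir)
del sys.modules["fake_test_module"]
os.remove(module_path)

failure = None
try:
    pickle.loads(blob)  # triggers __import__("fake_test_module"), which fails
except ImportError as exc:  # ModuleNotFoundError subclasses ImportError
    failure = exc
print("worker-side failure:", failure)
```

In the real job the test module is present on the worker (Beam itself is installed); it is line 27's `import hamcrest as hc` inside it that fails, so the import chain triggered by `pickler.loads()` cannot complete and the work item fails on every retry until the service gives up after four attempts.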