See 
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1161/display/redirect?page=changes>

Changes:

[robertwb] Add mapper microbenchmark.

[robertwb] Remove from docs generation.

------------------------------------------
[...truncated 774.42 KB...]
        "serialized_fn": "<string of 1160 bytes>", 
        "user_name": "assert_that/Match"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s16", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": 
"VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==",
 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-22T08:58:18.381204Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-22_01_58_17-7692825631103106013'
 location: u'us-central1'
 name: u'beamapp-jenkins-0322085810-054988'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-22_01_58_17-7692825631103106013]
root: INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_58_17-7692825631103106013?project=apache-beam-testing
root: INFO: Job 2018-03-22_01_58_17-7692825631103106013 is in state 
JOB_STATE_PENDING
root: INFO: 2018-03-22T08:58:17.641Z: JOB_MESSAGE_WARNING: Job 
2018-03-22_01_58_17-7692825631103106013 might autoscale up to 250 workers.
root: INFO: 2018-03-22T08:58:17.660Z: JOB_MESSAGE_DETAILED: Autoscaling is 
enabled for job 2018-03-22_01_58_17-7692825631103106013. The number of workers 
will be between 1 and 250.
root: INFO: 2018-03-22T08:58:17.673Z: JOB_MESSAGE_DETAILED: Autoscaling was 
automatically enabled for job 2018-03-22_01_58_17-7692825631103106013.
root: INFO: 2018-03-22T08:58:20.992Z: JOB_MESSAGE_DETAILED: Checking required 
Cloud APIs are enabled.
root: INFO: 2018-03-22T08:58:21.116Z: JOB_MESSAGE_DETAILED: Checking 
permissions granted to controller Service Account.
root: INFO: 2018-03-22T08:58:21.976Z: JOB_MESSAGE_DETAILED: Expanding 
CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-22T08:58:21.993Z: JOB_MESSAGE_DEBUG: Combiner lifting 
skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a 
combiner.
root: INFO: 2018-03-22T08:58:22.050Z: JOB_MESSAGE_DETAILED: Expanding 
GroupByKey operations into optimizable parts.
root: INFO: 2018-03-22T08:58:22.063Z: JOB_MESSAGE_DETAILED: Lifting 
ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-22T08:58:22.089Z: JOB_MESSAGE_DEBUG: Annotating graph with 
Autotuner information.
root: INFO: 2018-03-22T08:58:22.113Z: JOB_MESSAGE_DETAILED: Fusing adjacent 
ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-22T08:58:22.161Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
s11 for input s10.out
root: INFO: 2018-03-22T08:58:22.177Z: JOB_MESSAGE_DETAILED: Fusing unzipped 
copy of assert_that/Group/GroupByKey/Reify, through flatten 
assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-22T08:58:22.190Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/GroupByKey/GroupByWindow into 
assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-22T08:58:22.212Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-22T08:58:22.278Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-22T08:58:22.298Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/Map(_merge_tagged_vals_under_key) into 
assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-22T08:58:22.310Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-22T08:58:22.331Z: JOB_MESSAGE_DETAILED: Fusing unzipped 
copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into 
producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-22T08:58:22.345Z: JOB_MESSAGE_DETAILED: Fusing consumer 
compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-22T08:58:22.413Z: JOB_MESSAGE_DETAILED: Fusing consumer 
compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-22T08:58:22.426Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-22T08:58:22.437Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-22T08:58:22.447Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-22T08:58:22.458Z: JOB_MESSAGE_DETAILED: Fusing consumer 
compute/compute into start/Read
root: INFO: 2018-03-22T08:58:22.471Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-22T08:58:22.545Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-22T08:58:22.564Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-22T08:58:22.579Z: JOB_MESSAGE_DEBUG: Workflow config is 
missing a default resource spec.
root: INFO: 2018-03-22T08:58:22.594Z: JOB_MESSAGE_DEBUG: Adding StepResource 
setup and teardown to workflow graph.
root: INFO: 2018-03-22T08:58:22.666Z: JOB_MESSAGE_DEBUG: Adding workflow start 
and stop steps.
root: INFO: 2018-03-22T08:58:22.680Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-22T08:58:22.806Z: JOB_MESSAGE_DEBUG: Executing wait step 
start22
root: INFO: 2018-03-22T08:58:22.842Z: JOB_MESSAGE_BASIC: Executing operation 
side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-22T08:58:22.908Z: JOB_MESSAGE_BASIC: Executing operation 
assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-22T08:58:22.920Z: JOB_MESSAGE_DEBUG: Starting worker pool 
setup.
root: INFO: 2018-03-22T08:58:22.960Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-f...
root: INFO: 2018-03-22T08:58:23.009Z: JOB_MESSAGE_DEBUG: Value 
"assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-22T08:58:23.048Z: JOB_MESSAGE_BASIC: Executing operation 
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: Job 2018-03-22_01_58_17-7692825631103106013 is in state 
JOB_STATE_RUNNING
root: INFO: 2018-03-22T08:58:32.001Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised 
the number of workers to 0 based on the rate of progress in the currently 
running step(s).
root: INFO: 2018-03-22T08:58:47.959Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised 
the number of workers to 1 based on the rate of progress in the currently 
running step(s).
root: INFO: 2018-03-22T08:59:56.871Z: JOB_MESSAGE_DETAILED: Workers have 
started successfully.
root: INFO: 2018-03-22T09:03:09.601Z: JOB_MESSAGE_DEBUG: Value 
"compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-22T09:03:09.666Z: JOB_MESSAGE_BASIC: Executing operation 
compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-22T09:03:09.765Z: JOB_MESSAGE_DEBUG: Value 
"compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-22T09:03:09.842Z: JOB_MESSAGE_BASIC: Executing operation 
start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-22T09:03:18.549Z: JOB_MESSAGE_ERROR: Traceback (most recent 
call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", 
line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", 
line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in 
apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in 
apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in 
apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in 
apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in 
apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in 
apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File 
"/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", 
line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute 
'_from_runtime_iterable'
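
For reference, the traceback above fails in the side-input access path of
apache_beam/transforms/sideinputs.py, where the map's __getitem__ asks its
view class for a _from_runtime_iterable classmethod. The sketch below is a
simplified, hypothetical reconstruction (not the worker's actual code) that
reproduces the same AttributeError when a view class omits that hook:

    # Simplified sketch of the side-input lookup pattern in the traceback.
    # Class names here are illustrative, not the real Dataflow/Beam classes.
    class _WorkingSideInputView(object):
        @classmethod
        def _from_runtime_iterable(cls, it, options):
            # Turn the raw runtime iterable into the user-facing value.
            return list(it)

    class _BrokenSideInputView(object):
        # No _from_runtime_iterable defined; mirrors the failing class.
        pass

    class SideInputMap(object):
        def __init__(self, view_class, view_options, iterable):
            self._view_class = view_class
            self._view_options = view_options
            self._iterable = iterable
            self._cache = {}

        def __getitem__(self, window):
            if window not in self._cache:
                # Same call shape as sideinputs.py line 62 in the traceback.
                self._cache[window] = self._view_class._from_runtime_iterable(
                    self._iterable, self._view_options)
            return self._cache[window]

    SideInputMap(_WorkingSideInputView, {}, iter([1, 2, 3]))['w']  # returns [1, 2, 3]
    SideInputMap(_BrokenSideInputView, {}, iter([1, 2, 3]))['w']   # raises AttributeError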

root: INFO: 2018-03-22T09:03:21.915Z: JOB_MESSAGE_ERROR: Traceback (most recent 
call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", 
line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", 
line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in 
apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in 
apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in 
apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in 
apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in 
apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in 
apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File 
"/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", 
line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute 
'_from_runtime_iterable'

root: INFO: 2018-03-22T09:03:25.280Z: JOB_MESSAGE_ERROR: Traceback (most recent 
call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", 
line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", 
line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in 
apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in 
apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in 
apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in 
apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in 
apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in 
apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File 
"/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", 
line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute 
'_from_runtime_iterable'

root: INFO: 2018-03-22T09:03:27.632Z: JOB_MESSAGE_ERROR: Traceback (most recent 
call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", 
line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", 
line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in 
apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in 
apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in 
apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in 
apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in 
apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in 
apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File 
"/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", 
line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute 
'_from_runtime_iterable'

root: INFO: 2018-03-22T09:03:27.675Z: JOB_MESSAGE_DEBUG: Executing failure step 
failure21
root: INFO: 2018-03-22T09:03:27.698Z: JOB_MESSAGE_ERROR: Workflow failed. 
Causes: 
S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
 failed., A work item was attempted 4 times without success. Each time the 
worker eventually lost contact with the service. The work item was attempted 
on: 
  beamapp-jenkins-032208581-03220158-be88-harness-zn22,
  beamapp-jenkins-032208581-03220158-be88-harness-zn22,
  beamapp-jenkins-032208581-03220158-be88-harness-zn22,
  beamapp-jenkins-032208581-03220158-be88-harness-zn22
root: INFO: 2018-03-22T09:03:27.809Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-22T09:03:27.849Z: JOB_MESSAGE_DEBUG: Starting worker pool 
teardown.
root: INFO: 2018-03-22T09:03:27.874Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-22T09:04:54.318Z: JOB_MESSAGE_DETAILED: Autoscaling: 
Reduced the number of workers to 0 based on the rate of progress in the 
currently running step(s).
root: INFO: 2018-03-22T09:04:54.354Z: JOB_MESSAGE_DEBUG: Tearing down pending 
resources...
root: INFO: Job 2018-03-22_01_58_17-7692825631103106013 is in state 
JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1704.813s

FAILED (errors=9)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_37_47-13823145180391713259?project=apache-beam-testing
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_44_41-9967498865034380562?project=apache-beam-testing
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_52_06-49136134126293855?project=apache-beam-testing
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_57_55-15829709246401676104?project=apache-beam-testing
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_37_48-6523449032820393493?project=apache-beam-testing
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_45_03-650836264486669640?project=apache-beam-testing
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_51_38-14371974433753604431?project=apache-beam-testing
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_58_17-7692825631103106013?project=apache-beam-testing
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_37_47-11560623165462055443?project=apache-beam-testing
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_45_08-11478375294299167663?project=apache-beam-testing
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_52_12-10559575549352155974?project=apache-beam-testing
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_59_21-8668743851652441267?project=apache-beam-testing
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_37_46-14067669291675476509?project=apache-beam-testing
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_44_30-15526957166087225410?project=apache-beam-testing
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_51_15-13273917207383285661?project=apache-beam-testing
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_01_58_03-3531309710895827256?project=apache-beam-testing
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user hero...@google.com
Not sending mail to unregistered user aal...@gmail.com
Not sending mail to unregistered user mark...@google.com