See <https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/420/display/redirect?page=changes>
Changes:

[valentyn] Add version guards to requirements file for integration tests.

[lcwik] [BEAM-9030] Align version of protoc/protoc-gen-grpc-java to vendored

[lukecwik] [BEAM-7951] Supports multiple inputs/outputs for wire coder settings.

------------------------------------------
[...truncated 33.59 MB...]
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:52)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:43)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:524)
	at org.apache.beam.sdk.fn.channel.ManagedChannelFactory.forDescriptor(ManagedChannelFactory.java:44)
	at org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory$1.close(ExternalEnvironmentFactory.java:155)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DefaultJobBundleFactory.java:481)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.close(DefaultJobBundleFactory.java:481)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.unref(DefaultJobBundleFactory.java:496)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.access$1800(DefaultJobBundleFactory.java:436)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.lambda$createEnvironmentCaches$3(DefaultJobBundleFactory.java:168)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1809)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3462)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3438)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3215)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.clear(LocalCache.java:4270)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4909)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.close(DefaultJobBundleFactory.java:258)
	at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.close(DefaultExecutableStageContext.java:43)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.closeActual(ReferenceCountingExecutableStageContextFactory.java:208)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.access$200(ReferenceCountingExecutableStageContextFactory.java:184)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.release(ReferenceCountingExecutableStageContextFactory.java:173)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.scheduleRelease(ReferenceCountingExecutableStageContextFactory.java:132)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.access$300(ReferenceCountingExecutableStageContextFactory.java:44) at
org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.close(ReferenceCountingExecutableStageContextFactory.java:204) at org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator.$closeResource(ExecutableStageDoFnOperator.java:489) at org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator.dispose(ExecutableStageDoFnOperator.java:489) at org.apache.flink.streaming.runtime.tasks.StreamTask.tryDisposeAllOperators(StreamTask.java:562) at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:443) at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705) at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530) at java.lang.Thread.run(Thread.java:748) INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0 INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:41213. INFO:apache_beam.runners.worker.sdk_worker:Control channel established. INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers. [grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService - Beam Fn Control client connected with id 32-1 [[4]assert_that/{Create, Group} (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - [4]assert_that/{Create, Group} (2/2) (cfca87210dbaf9680bcdaceb90a6ad52) switched from RUNNING to FINISHED. [[4]assert_that/{Create, Group} (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for [4]assert_that/{Create, Group} (2/2) (cfca87210dbaf9680bcdaceb90a6ad52). [[1]Create/FlatMap(<lambda at core.py:2597>) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - [1]Create/FlatMap(<lambda at core.py:2597>) (1/2) (b2ab629ab72de89da3fed309a7394cc3) switched from RUNNING to FINISHED. [[1]Create/FlatMap(<lambda at core.py:2597>) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for [1]Create/FlatMap(<lambda at core.py:2597>) (1/2) (b2ab629ab72de89da3fed309a7394cc3). [[4]assert_that/{Create, Group} (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task [4]assert_that/{Create, Group} (2/2) (cfca87210dbaf9680bcdaceb90a6ad52) [FINISHED] [[1]Create/FlatMap(<lambda at core.py:2597>) (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task [1]Create/FlatMap(<lambda at core.py:2597>) (1/2) (b2ab629ab72de89da3fed309a7394cc3) [FINISHED] [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [4]assert_that/{Create, Group} cfca87210dbaf9680bcdaceb90a6ad52. [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [1]Create/FlatMap(<lambda at core.py:2597>) b2ab629ab72de89da3fed309a7394cc3. [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [4]assert_that/{Create, Group} (2/2) (cfca87210dbaf9680bcdaceb90a6ad52) switched from RUNNING to FINISHED. [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [1]Create/FlatMap(<lambda at core.py:2597>) (1/2) (b2ab629ab72de89da3fed309a7394cc3) switched from RUNNING to FINISHED. 
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:34383. INFO:apache_beam.runners.worker.sdk_worker:State channel established. INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:35713 [grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.data.GrpcDataService - Beam Fn Data client connected. [[1]Create/FlatMap(<lambda at core.py:2597>) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - [1]Create/FlatMap(<lambda at core.py:2597>) (2/2) (726bd62e925ed500ce63443f43d7f457) switched from RUNNING to FINISHED. [[1]Create/FlatMap(<lambda at core.py:2597>) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for [1]Create/FlatMap(<lambda at core.py:2597>) (2/2) (726bd62e925ed500ce63443f43d7f457). [[1]Create/FlatMap(<lambda at core.py:2597>) (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task [1]Create/FlatMap(<lambda at core.py:2597>) (2/2) (726bd62e925ed500ce63443f43d7f457) [FINISHED] [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [1]Create/FlatMap(<lambda at core.py:2597>) 726bd62e925ed500ce63443f43d7f457. [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [1]Create/FlatMap(<lambda at core.py:2597>) (2/2) (726bd62e925ed500ce63443f43d7f457) switched from RUNNING to FINISHED. [[4]assert_that/{Create, Group} (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - [4]assert_that/{Create, Group} (1/2) (6f91aeb4c20eb94113f531798417c996) switched from RUNNING to FINISHED. [[4]assert_that/{Create, Group} (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for [4]assert_that/{Create, Group} (1/2) (6f91aeb4c20eb94113f531798417c996). [[4]assert_that/{Create, Group} (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task [4]assert_that/{Create, Group} (1/2) (6f91aeb4c20eb94113f531798417c996) [FINISHED] [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [4]assert_that/{Create, Group} 6f91aeb4c20eb94113f531798417c996. [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [4]assert_that/{Create, Group} (1/2) (6f91aeb4c20eb94113f531798417c996) switched from RUNNING to FINISHED. [[3]{Create, Map(<lambda at fn_api_runner_test.py:599>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - [3]{Create, Map(<lambda at fn_api_runner_test.py:599>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2) (005b811cd1b32dc6bc56cc9050ede0b1) switched from RUNNING to FINISHED. [[3]{Create, Map(<lambda at fn_api_runner_test.py:599>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for [3]{Create, Map(<lambda at fn_api_runner_test.py:599>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2) (005b811cd1b32dc6bc56cc9050ede0b1). 
[[3]{Create, Map(<lambda at fn_api_runner_test.py:599>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task [3]{Create, Map(<lambda at fn_api_runner_test.py:599>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2) (005b811cd1b32dc6bc56cc9050ede0b1) [FINISHED] [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [3]{Create, Map(<lambda at fn_api_runner_test.py:599>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem 005b811cd1b32dc6bc56cc9050ede0b1. [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [3]{Create, Map(<lambda at fn_api_runner_test.py:599>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (2/2) (005b811cd1b32dc6bc56cc9050ede0b1) switched from RUNNING to FINISHED. [[3]{Create, Map(<lambda at fn_api_runner_test.py:599>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - [3]{Create, Map(<lambda at fn_api_runner_test.py:599>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2) (97fc0fddca396496c805e5105db1e4ff) switched from RUNNING to FINISHED. [[3]{Create, Map(<lambda at fn_api_runner_test.py:599>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for [3]{Create, Map(<lambda at fn_api_runner_test.py:599>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2) (97fc0fddca396496c805e5105db1e4ff). [[3]{Create, Map(<lambda at fn_api_runner_test.py:599>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task [3]{Create, Map(<lambda at fn_api_runner_test.py:599>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2) (97fc0fddca396496c805e5105db1e4ff) [FINISHED] [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task [3]{Create, Map(<lambda at fn_api_runner_test.py:599>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem 97fc0fddca396496c805e5105db1e4ff. [GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:602>), assert_that} (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:602>), assert_that} (2/2) (06ffd0612739fb8b43508fda36454119) switched from RUNNING to FINISHED. [GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:602>), assert_that} (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:602>), assert_that} (2/2) (06ffd0612739fb8b43508fda36454119). [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - [3]{Create, Map(<lambda at fn_api_runner_test.py:599>), WindowInto(WindowIntoFn)} -> ToKeyedWorkItem (1/2) (97fc0fddca396496c805e5105db1e4ff) switched from RUNNING to FINISHED. 
[GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:602>), assert_that} (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:602>), assert_that} (2/2) (06ffd0612739fb8b43508fda36454119) [FINISHED] [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:602>), assert_that} 06ffd0612739fb8b43508fda36454119. [ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - ToKeyedWorkItem (2/2) (eee613af7d96cc61f243515bcee00488) switched from RUNNING to FINISHED. [ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for ToKeyedWorkItem (2/2) (eee613af7d96cc61f243515bcee00488). [ToKeyedWorkItem (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (2/2) (eee613af7d96cc61f243515bcee00488) [FINISHED] [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task ToKeyedWorkItem eee613af7d96cc61f243515bcee00488. [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:602>), assert_that} (2/2) (06ffd0612739fb8b43508fda36454119) switched from RUNNING to FINISHED. [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - ToKeyedWorkItem (2/2) (eee613af7d96cc61f243515bcee00488) switched from RUNNING to FINISHED. [GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:602>), assert_that} (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:602>), assert_that} (1/2) (a52a59556bcbdfdda79f82ea584fc6e5) switched from RUNNING to FINISHED. [GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:602>), assert_that} (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:602>), assert_that} (1/2) (a52a59556bcbdfdda79f82ea584fc6e5). [GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:602>), assert_that} (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:602>), assert_that} (1/2) (a52a59556bcbdfdda79f82ea584fc6e5) [FINISHED] [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:602>), assert_that} a52a59556bcbdfdda79f82ea584fc6e5. [ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - ToKeyedWorkItem (1/2) (bf4f4112c0c61b64c00259344d2b7f4d) switched from RUNNING to FINISHED. [ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for ToKeyedWorkItem (1/2) (bf4f4112c0c61b64c00259344d2b7f4d). 
[ToKeyedWorkItem (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task ToKeyedWorkItem (1/2) (bf4f4112c0c61b64c00259344d2b7f4d) [FINISHED] [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task ToKeyedWorkItem bf4f4112c0c61b64c00259344d2b7f4d. [assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2) (154c786cb4eef65c608d75ccdbee8351) switched from RUNNING to FINISHED. [assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2) (154c786cb4eef65c608d75ccdbee8351). [assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2) (154c786cb4eef65c608d75ccdbee8351) [FINISHED] [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} 154c786cb4eef65c608d75ccdbee8351. [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupByKey -> [5]{Map(<lambda at fn_api_runner_test.py:602>), assert_that} (1/2) (a52a59556bcbdfdda79f82ea584fc6e5) switched from RUNNING to FINISHED. [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - ToKeyedWorkItem (1/2) (bf4f4112c0c61b64c00259344d2b7f4d) switched from RUNNING to FINISHED. [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (2/2) (154c786cb4eef65c608d75ccdbee8351) switched from RUNNING to FINISHED. [assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)] INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory - Closing environment urn: "beam:env:external:v1" payload: "\n\021\022\017localhost:34665" [assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint. ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane. 
Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 423, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 413, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 703, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1579157269.119063647","description":"Error received from peer ipv4:127.0.0.1:35713","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "apache_beam/runners/worker/data_plane.py", line 438, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "apache_beam/runners/worker/data_plane.py", line 423, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 413, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 703, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1579157269.119063647","description":"Error received from peer ipv4:127.0.0.1:35713","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
INFO:apache_beam.runners.worker.sdk_worker:No more requests from control plane
INFO:apache_beam.runners.worker.sdk_worker:SDK Harness waiting for in-flight requests to complete
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2) (69d3235942f99724e9c43a2ba537e15b) switched from RUNNING to FINISHED.
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2) (69d3235942f99724e9c43a2ba537e15b).
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2) (69d3235942f99724e9c43a2ba537e15b) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} 69d3235942f99724e9c43a2ba537e15b.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2) (69d3235942f99724e9c43a2ba537e15b) switched from RUNNING to FINISHED. INFO:apache_beam.runners.worker.data_plane:Closing all cached grpc data channels. [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job test_windowing_1579157266.73 (5112e94d13e779dc1ee208c1048ab6f9) switched from state RUNNING to FINISHED. [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.checkpoint.CheckpointCoordinator - Stopping checkpoint coordinator for job 5112e94d13e779dc1ee208c1048ab6f9. [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.checkpoint.StandaloneCompletedCheckpointStore - Shutting down INFO:apache_beam.runners.worker.sdk_worker:Closing all cached gRPC state handlers. INFO:apache_beam.runners.worker.sdk_worker:Done consuming work. [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job 5112e94d13e779dc1ee208c1048ab6f9 reached globally terminal state FINISHED. [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job test_windowing_1579157266.73(5112e94d13e779dc1ee208c1048ab6f9). [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:1, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=8130}, allocationId: 649315cfa251960e4cd1507b47bcef80, jobId: 5112e94d13e779dc1ee208c1048ab6f9). [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:0, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=8130}, allocationId: 43e38bb1f44b8de3cee2f46c57bfc74f, jobId: 5112e94d13e779dc1ee208c1048ab6f9). [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Remove job 5112e94d13e779dc1ee208c1048ab6f9 from job leader monitoring. [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job 5112e94d13e779dc1ee208c1048ab6f9. [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool. [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection 400aa3ef55c45ce035d0c88b84c9894a: JobManager is shutting down.. [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job 5112e94d13e779dc1ee208c1048ab6f9. [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Cannot reconnect to job 5112e94d13e779dc1ee208c1048ab6f9 because it is not registered. [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool. 
[flink-runner-job-invoker] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect job manager a0ac6924fdb8ef16353fd243227e4b41@akka://flink/user/jobmanager_63 for job 5112e94d13e779dc1ee208c1048ab6f9 from the resource manager. [flink-runner-job-invoker] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint. [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_62. [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close ResourceManager connection 400aa3ef55c45ce035d0c88b84c9894a. [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Closing TaskExecutor connection bb348ad3-d98c-4ff1-928b-505fe0defa6f because: The TaskExecutor is shutting down. [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service. [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager. [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-io-38adb963-393d-4811-8cc2-4001b0fe40c6 [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the network environment and its components. [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-8372c10d-df51-4364-baf1-730af77a0703 [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components. [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service. [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-5d2d63b3-b190-47b9-b8f4-5629066ed3f1 [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_62. [ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui [ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete. [flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed.. [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher. [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher. [flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager. 
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager. [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator. [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher. [flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service. [flink-akka.actor.default-dispatcher-3] INFO org.apache.beam.runners.flink.metrics.FileReporter - wrote metrics to /tmp/flinktest-confk2p6Y4/test-metrics.txt [flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon. [flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports. [flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down. [flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service. [flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service. [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:41819 [flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service. 
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 546 msecs [flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values: [flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_33}: 0, 14Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2597>)_4}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_31}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 5, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_31}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 5, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 5, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 
24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_42}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 14Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 1, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_31}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_42}: 0, 14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_41}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_WindowInto(WindowIntoFn)_18}: 0, 14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 5, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at fn_api_runner_test.py:602>)_23}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_27:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_29}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: 1, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_41}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_34}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_WindowInto(WindowIntoFn)_18}: 0, 14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2597>)_27}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: 1, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_31}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: 1, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at fn_api_runner_test.py:602>)_23}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: 1, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_WindowInto(WindowIntoFn)_18}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2597>)_27}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: 1, 14Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_29}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_29}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2597>)_27}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_29}: 0, 14Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_WindowInto(WindowIntoFn)_18}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 2, 14Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 5, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at fn_api_runner_test.py:602>)_23}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: 2, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2597>)_27}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_33}: 0, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40}: 0, 14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_2:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_41}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_34}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at fn_api_runner_test.py:599>)_17}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_41}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: 2, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at fn_api_runner_test.py:599>)_17}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_17:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_42}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at fn_api_runner_test.py:602>)_23}: 0, 14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2597>)_4}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_34}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_Create/Map(decode)_16}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:1:0}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_34}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_24:0:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_33}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 2, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: 2, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_42}: 0, 14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2597>)_4}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at fn_api_runner_test.py:599>)_17}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 2, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Map(<lambda at fn_api_runner_test.py:599>)_17}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 14Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2597>)_4}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_35}: 0, 
48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_15:0}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_12:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_33}: 0, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0)Distributions(14Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=70, count=5, min=14, max=14}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=100, count=5, min=20, max=20}, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:0}: DistributionResult{sum=60, count=2, min=29, max=31}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=140, count=5, min=28, max=28}, 48Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=70, count=5, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_24:1}: DistributionResult{sum=19, count=1, min=19, max=19}, 14Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=70, count=5, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=15, count=1, min=15, max=15}, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=75, count=2, min=36, max=39}, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_18}: DistributionResult{sum=15, count=1, min=15, max=15}, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16}: DistributionResult{sum=66, count=2, min=32, max=34}, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_17}: DistributionResult{sum=13, count=1, min=13, max=13}, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_30}: DistributionResult{sum=14, count=1, min=14, max=14}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_28}: DistributionResult{sum=47, count=1, min=47, max=47}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_29}: DistributionResult{sum=39, count=1, min=39, max=39}, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=17, count=1, min=17, max=17}, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=46, count=2, min=22, max=24}, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=48, count=2, min=23, max=25}, 24GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_23}: DistributionResult{sum=52, count=2, min=25, max=27}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_27}: DistributionResult{sum=59, count=1, min=59, max=59})) [flink-runner-job-invoker] WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Failed to remove job staging directory for token {"sessionId":"job_dbe3901f-5a6b-4dcb-a702-f8846306243a","basePath":"/tmp/flinktestSa4K9Q"}: {} java.io.FileNotFoundException: /tmp/flinktestSa4K9Q/job_dbe3901f-5a6b-4dcb-a702-f8846306243a/MANIFEST (No such file or directory) at java.io.FileInputStream.open0(Native Method) at java.io.FileInputStream.open(FileInputStream.java:195) at java.io.FileInputStream.<init>(FileInputStream.java:138) at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118) at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82) at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252) at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88) at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92) at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63) at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201) at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:247) at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48) at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:113) at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:99) at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at 
java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.INFO:__main__:removing conf dir: /tmp/flinktest-confk2p6Y4
----------------------------------------------------------------------
Ran 78 tests in 186.335s

OK (skipped=14)

FAILURE: Build failed with an exception.

* Where:
Script '<https://builds.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 55

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:flinkCompatibilityMatrixBatchLOOPBACK'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 36m 0s

72 actionable tasks: 54 executed, 17 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/xq6uuc7wkbtis

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org