Tyson Hamilton created BEAM-11488:
-------------------------------------
Summary: Spark test failure:
org.apache.beam.sdk.metrics.MetricsTest$AttemptedMetricTests.testAttemptedCounterMetrics
Key: BEAM-11488
URL: https://issues.apache.org/jira/browse/BEAM-11488
Project: Beam
Issue Type: Sub-task
Components: runner-spark, test-failures
Reporter: Tyson Hamilton
From:
[https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Streaming/464/testReport/org.apache.beam.sdk.metrics/MetricsTest$AttemptedMetricTests/testAttemptedCounterMetrics/]
Others:
[https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Streaming/463/testReport/org.apache.beam.sdk.metrics/MetricsTest$AttemptedMetricTests/testAttemptedCounterMetrics/]
[https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Streaming/462/testReport/org.apache.beam.sdk.metrics/MetricsTest$AttemptedMetricTests/testAttemptedCounterMetrics/]
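For context, the failing assertion queries the pipeline's metric results for the attempted value of the counter {{count}} in namespace {{org.apache.beam.sdk.metrics.MetricsTest}} at step {{MyStep1}} (in Beam this goes through {{PipelineResult.metrics().queryMetrics(...)}}), and fails because the Spark runner returned an empty collection rather than a result with {{attempted=3L}}. A minimal self-contained sketch of that assertion shape — the {{MetricResult}} record and {{containsAttempted}} helper below are illustrative stand-ins, not Beam's actual classes:

```java
import java.util.List;

public class AttemptedCounterCheck {
    // Hypothetical stand-in for org.apache.beam.sdk.metrics.MetricResult<Long>.
    record MetricResult(String namespace, String name, String step, long attempted) {}

    // Mirrors the Hamcrest "collection containing" check that failed: it passes only
    // if some queried result matches namespace/name/step with the expected attempted value.
    static boolean containsAttempted(List<MetricResult> results,
                                     String ns, String name, String step, long expected) {
        return results.stream().anyMatch(r ->
            r.namespace().equals(ns) && r.name().equals(name)
                && r.step().equals(step) && r.attempted() == expected);
    }

    public static void main(String[] args) {
        // The runner returned no metric results at all, so the check fails exactly as
        // reported: "Expected: a collection containing ... but: was empty".
        List<MetricResult> fromRunner = List.of();
        System.out.println(containsAttempted(fromRunner,
            "org.apache.beam.sdk.metrics.MetricsTest", "count", "MyStep1", 3L)); // prints false
    }
}
```

The empty collection suggests the Spark streaming runner never surfaced the attempted counter for that step, rather than surfacing it with a wrong value.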
{code:java}
Failed
org.apache.beam.sdk.metrics.MetricsTest$AttemptedMetricTests.testAttemptedCounterMetrics
Failing for the past 3 builds (since #462). Took 36 sec.

Error Message
java.lang.AssertionError:
Expected: a collection containing MetricResult{inNamespace="org.apache.beam.sdk.metrics.MetricsTest", name="count", step="MyStep1", attempted=<3L>}
     but: was empty

Stacktrace
java.lang.AssertionError:
Expected: a collection containing MetricResult{inNamespace="org.apache.beam.sdk.metrics.MetricsTest", name="count", step="MyStep1", attempted=<3L>}
     but: was empty
	at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:18)
	at org.junit.Assert.assertThat(Assert.java:966)
	at org.junit.Assert.assertThat(Assert.java:931)
	at org.apache.beam.sdk.metrics.MetricsTest.assertCounterMetrics(MetricsTest.java:378)
	at org.apache.beam.sdk.metrics.MetricsTest.access$300(MetricsTest.java:60)
	at org.apache.beam.sdk.metrics.MetricsTest$AttemptedMetricTests.testAttemptedCounterMetrics(MetricsTest.java:357)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:322)
	at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:266)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:305)
	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:365)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:330)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:78)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:328)
	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:65)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:292)
	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:305)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:412)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
	at sun.reflect.GeneratedMethodAccessor156.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
	at sun.reflect.GeneratedMethodAccessor155.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
	at java.lang.Thread.run(Thread.java:748)
{code}
Standard Output
Shutting SDK harness down.
Standard Error
20/12/16 12:34:44 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:45763
20/12/16 12:34:44 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:43129
20/12/16 12:34:44 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:32955
20/12/16 12:34:46 INFO org.apache.beam.runners.portability.PortableRunner: Using job server endpoint: localhost:32955
20/12/16 12:34:46 INFO org.apache.beam.runners.portability.PortableRunner: PrepareJobResponse: preparation_id: "metricstest0attemptedmetrictests0testattempteddistributionmetrics-jenkins-1216123446-8e555f47_e7119a25-affe-477b-86c4-7910965cbcb4" artifact_staging_endpoint { url: "localhost:45763" } staging_session_token: "metricstest0attemptedmetrictests0testattempteddistributionmetrics-jenkins-1216123446-8e555f47_e7119a25-affe-477b-86c4-7910965cbcb4"
20/12/16 12:34:46 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for metricstest0attemptedmetrictests0testattempteddistributionmetrics-jenkins-1216123446-8e555f47_e7119a25-affe-477b-86c4-7910965cbcb4.
20/12/16 12:34:46 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for metricstest0attemptedmetrictests0testattempteddistributionmetrics-jenkins-1216123446-8e555f47_e7119a25-affe-477b-86c4-7910965cbcb4.EMBEDDED.
20/12/16 12:34:46 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 313 artifacts for metricstest0attemptedmetrictests0testattempteddistributionmetrics-jenkins-1216123446-8e555f47_e7119a25-affe-477b-86c4-7910965cbcb4.null.
20/12/16 12:34:47 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for metricstest0attemptedmetrictests0testattempteddistributionmetrics-jenkins-1216123446-8e555f47_e7119a25-affe-477b-86c4-7910965cbcb4.
Dec 16, 2020 12:34:47 PM org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=2099, target=directaddress:///InProcessServer_300} was not shutdown properly!!! ~*~*~* Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
java.lang.RuntimeException: ManagedChannel allocation site
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:94)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:52)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:43)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:524)
	at org.apache.beam.sdk.fn.channel.ManagedChannelFactory.forDescriptor(ManagedChannelFactory.java:44)
	at org.apache.beam.fn.harness.FnHarness.main(FnHarness.java:194)
	at org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory.lambda$createEnvironment$0(EmbeddedEnvironmentFactory.java:100)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Dec 16, 2020 12:34:47 PM org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference cleanQueue
SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=2103, target=directaddress:///InProcessServer_305} was not shutdown properly!!! ~*~*~* Make sure to call shutdown()/shutdownNow() and wait until awaitTermination() returns true.
java.lang.RuntimeException: ManagedChannel allocation site
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:94)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:52)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:43)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:524)
	at org.apache.beam.sdk.fn.channel.ManagedChannelFactory.forDescriptor(ManagedChannelFactory.java:44)
	at org.apache.beam.fn.harness.state.BeamFnStateGrpcClientCache$GrpcStateClient.<init>(BeamFnStateGrpcClientCache.java:89)
	at org.apache.beam.fn.harness.state.BeamFnStateGrpcClientCache$GrpcStateClient.<init>(BeamFnStateGrpcClientCache.java:79)
	at org.apache.beam.fn.harness.state.BeamFnStateGrpcClientCache.createBeamFnStateClient(BeamFnStateGrpcClientCache.java:75)
	at java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
	at org.apache.beam.fn.harness.state.BeamFnStateGrpcClientCache.forApiServiceDescriptor(BeamFnStateGrpcClientCache.java:71)
	at org.apache.beam.fn.harness.control.ProcessBundleHandler.createBundleProcessor(ProcessBundleHandler.java:456)
	at org.apache.beam.fn.harness.control.ProcessBundleHandler.lambda$processBundle$0(ProcessBundleHandler.java:284)
	at org.apache.beam.fn.harness.control.ProcessBundleHandler$BundleProcessorCache.get(ProcessBundleHandler.java:572)
	at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:279)
	at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:173)
	at org.apache.beam.fn.harness.control.BeamFnControlClient.lambda$processInstructionRequests$0(BeamFnControlClient.java:157)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
20/12/16 12:34:47 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job metricstest0attemptedmetrictests0testattempteddistributionmetrics-jenkins-1216123446-8e555f47_c392d334-e797-4697-9941-406d17eee2b0
20/12/16 12:34:47 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation metricstest0attemptedmetrictests0testattempteddistributionmetrics-jenkins-1216123446-8e555f47_c392d334-e797-4697-9941-406d17eee2b0
20/12/16 12:34:47 INFO org.apache.beam.runners.portability.PortableRunner: RunJobResponse: job_id: "metricstest0attemptedmetrictests0testattempteddistributionmetrics-jenkins-1216123446-8e555f47_c392d334-e797-4697-9941-406d17eee2b0"
20/12/16 12:34:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 313 files. (Enable logging at DEBUG level to see which files will be staged.)
20/12/16 12:34:47 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context.
20/12/16 12:34:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job metricstest0attemptedmetrictests0testattempteddistributionmetrics-jenkins-1216123446-8e555f47_c392d334-e797-4697-9941-406d17eee2b0 on Spark master local[4]
20/12/16 12:34:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job metricstest0attemptedmetrictests0testattempteddistributionmetrics-jenkins-1216123446-8e555f47_c392d334-e797-4697-9941-406d17eee2b0 on Spark master local[4]
20/12/16 12:34:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job metricstest0attemptedmetrictests0testattempteddistributionmetrics-jenkins-1216123446-8e555f47_c392d334-e797-4697-9941-406d17eee2b0: Pipeline translated successfully. Computing outputs
20/12/16 12:34:47 WARN org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't support checkpointing
[the QueueInputDStream WARN above repeats roughly twice per second for the remainder of the log, from 12:34:47 through 12:35:17; further occurrences omitted]
20/12/16 12:35:12 INFO org.apache.beam.fn.harness.FnHarness: Fn Harness started
20/12/16 12:35:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
20/12/16 12:35:12 WARN org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: No worker_id header provided in control request
20/12/16 12:35:12 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id
20/12/16 12:35:12 INFO org.apache.beam.fn.harness.FnHarness: Entering instruction processing loop
20/12/16 12:35:12 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 52-2
20/12/16 12:35:12 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
20/12/16 12:35:12 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 52-3
20/12/16 12:35:13 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder: Put new watermark block: {0=SparkWatermarks{lowWatermark=294247-01-09T04:00:54.775Z, highWatermark=294247-01-10T04:00:54.775Z, synchronizedProcessingTime=2020-12-16T12:34:47.285Z}}
20/12/16 12:35:13 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener: Batch with timestamp: 1608122087500 has completed, watermarks have been updated.
20/12/16 12:35:13 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks could be computed upon completion of batch: 1608122088000
20/12/16 12:35:13 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener: Batch with timestamp: 1608122088000 has completed, watermarks have been updated.
20/12/16 12:35:14 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks could be computed upon completion of batch: 1608122088500
20/12/16 12:35:14 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener: Batch with timestamp: 1608122088500 has completed, watermarks have been updated.
20/12/16 12:35:14 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks could be computed upon completion of batch: 1608122089000
20/12/16 12:35:14 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener: Batch with timestamp: 1608122089000 has completed, watermarks have been updated.
20/12/16 12:35:14 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks could be computed upon completion of batch: 1608122089500
20/12/16 12:35:14 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener: Batch with timestamp: 1608122089500 has completed, watermarks have been updated.
20/12/16 12:35:15 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks could be computed upon completion of batch: 1608122090000
20/12/16 12:35:15 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener: Batch with timestamp: 1608122090000 has completed, watermarks have been updated.
20/12/16 12:35:15 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks could be computed upon completion of batch: 1608122090500
20/12/16 12:35:15 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener: Batch with timestamp: 1608122090500 has completed, watermarks have been updated.
20/12/16 12:35:15 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks could be computed upon completion of batch: 1608122091000
20/12/16 12:35:15 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener: Batch with timestamp: 1608122091000 has completed, watermarks have been updated.
20/12/16 12:35:16 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks could be computed upon completion of batch: 1608122091500
20/12/16 12:35:16 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener: Batch with timestamp: 1608122091500 has completed, watermarks have been updated.
20/12/16 12:35:16 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks could be computed upon completion of batch: 1608122092000
20/12/16 12:35:16 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener: Batch with timestamp: 1608122092000 has completed, watermarks have been updated.
20/12/16 12:35:16 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks could be computed upon completion of batch: 1608122092500
20/12/16 12:35:16 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener: Batch with timestamp: 1608122092500 has completed, watermarks have been updated.
20/12/16 12:35:17 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks could be computed upon completion of batch: 1608122093000
20/12/16 12:35:17 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener: Batch with timestamp: 1608122093000 has completed, watermarks have been updated.
20/12/16 12:35:17 WARN org.apache.spark.streaming.util.BatchedWriteAheadLog: BatchedWriteAheadLog Writer queue interrupted.
20/12/16 12:35:17 INFO org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122093500 20/12/16 12:35:17
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122093500 has completed, watermarks have been
updated. 20/12/16 12:35:17 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122094000 20/12/16 12:35:17
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122094000 has completed, watermarks have been
updated. 20/12/16 12:35:18 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122094500 20/12/16 12:35:18
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122094500 has completed, watermarks have been
updated. 20/12/16 12:35:18 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122095000 20/12/16 12:35:18
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122095000 has completed, watermarks have been
updated. 20/12/16 12:35:18 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122095500 20/12/16 12:35:18
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122095500 has completed, watermarks have been
updated. 20/12/16 12:35:19 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122096000 20/12/16 12:35:19
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122096000 has completed, watermarks have been
updated. 20/12/16 12:35:19 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122096500 20/12/16 12:35:19
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122096500 has completed, watermarks have been
updated. 20/12/16 12:35:20 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122097000 20/12/16 12:35:20
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122097000 has completed, watermarks have been
updated. 20/12/16 12:35:20 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122097500 20/12/16 12:35:20
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122097500 has completed, watermarks have been
updated. 20/12/16 12:35:20 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122098000 20/12/16 12:35:20
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122098000 has completed, watermarks have been
updated. 20/12/16 12:35:21 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122098500 20/12/16 12:35:21
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122098500 has completed, watermarks have been
updated. 20/12/16 12:35:21 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122099000 20/12/16 12:35:21
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122099000 has completed, watermarks have been
updated. 20/12/16 12:35:22 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122099500 20/12/16 12:35:22
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122099500 has completed, watermarks have been
updated. 20/12/16 12:35:22 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122100000 20/12/16 12:35:22
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122100000 has completed, watermarks have been
updated. 20/12/16 12:35:23 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122100500 20/12/16 12:35:23
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122100500 has completed, watermarks have been
updated. 20/12/16 12:35:23 INFO
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing
environment urn: "EMBEDDED" capabilities: "beam:coder:bytes:v1" capabilities:
"beam:coder:bool:v1" capabilities: "beam:coder:varint:v1" capabilities:
"beam:coder:string_utf8:v1" capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:timer:v1" capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:length_prefix:v1" capabilities:
"beam:coder:global_window:v1" capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:windowed_value:v1" capabilities:
"beam:coder:double:v1" capabilities: "beam:coder:row:v1" capabilities:
"beam:coder:param_windowed_value:v1" capabilities:
"beam:coder:state_backed_iterable:v1" capabilities: "beam:coder:sharded_key:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1" capabilities:
"beam:protocol:progress_reporting:v1" capabilities:
"beam:version:sdk_base:apache/beam_java8_sdk:2.27.0.dev" capabilities:
"beam:transform:sdf_truncate_sized_restrictions:v1" dependencies { type_urn:
"beam:artifact:type:file:v1" type_payload:
"\n\244\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/1-EMBEDDED-icedtea-sound-Iwne2hzRY_LcFqVV9QV5vKlMjP51hQ-1eOLJzzvtpbI.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\n=icedtea-sound-Iwne2hzRY_LcFqVV9QV5vKlMjP51hQ-1eOLJzzvtpbI.jar" }
[... dependency entries 2-38 follow the same type_urn / type_payload /
role_urn / role_payload form, staging the remaining classpath jars from
/tmp/beam-artifact-staging: jaccess, localedata, nashorn, cldrdata, dnsns,
gradle-worker, the beam-runners-* and beam-sdks-* 2.27.0-SNAPSHOT jars
(including tests and unshaded variants), guava-testlib-25.1-jre,
spark-*_2.11-2.4.7, and hadoop-*-2.10.1 ...]
...[truncated 89467 chars]...
[... dependency entries 263-288 continue in the same form: janino-3.0.16,
commons-compiler-3.0.16, antlr4-4.7, antlr4-runtime-4.7, aircompressor-0.10,
parquet-jackson-1.10.1, arrow-format-0.10.0, hppc-0.7.2, flatbuffers-1.2.0,
animal-sniffer-annotations-1.18, httpcore-4.4.13, jettison-1.1,
jaxb-impl-2.3.3, asm-3.1, java-xmlbuilder-0.4, nimbus-jose-jwt-7.9,
json-smart-2.3, accessors-smart-1.2, asm-5.0.4, ST4-4.0.8,
antlr-runtime-3.5.2, javassist-3.20.0-GA, flogger-0.5.1,
checker-compat-qual-2.5.3, netty-codec-4.1.51.Final, and
netty-transport-native-unix-common-4.1.51.Final ...]
dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\265\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/289-EMBEDDED-netty-transport-4.1.51.Final-5b4lnzWiRr9QStk-qPXfMYcrWr6_t1E4DquV1dyEDUQ.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\nLnetty-transport-4.1.51.Final-5b4lnzWiRr9QStk-qPXfMYcrWr6_t1E4DquV1dyEDUQ.jar"
} dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\264\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/290-EMBEDDED-netty-resolver-4.1.51.Final-yKd3ZeSB-_WQbFlutEHeSQlrNUvK4DVrdASsXpY5k1A.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\nKnetty-resolver-4.1.51.Final-yKd3ZeSB-_WQbFlutEHeSQlrNUvK4DVrdASsXpY5k1A.jar"
} dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\262\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/291-EMBEDDED-netty-buffer-4.1.51.Final-w8O3EOG1qN89YM1GAuCnQ0gdXmCeSqhS-iYp5OQS0kU.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\nInetty-buffer-4.1.51.Final-w8O3EOG1qN89YM1GAuCnQ0gdXmCeSqhS-iYp5OQS0kU.jar"
} dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\262\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/292-EMBEDDED-netty-common-4.1.51.Final-EQ4GUV9DkTorusI-GqeLf1muCdRmsAr1_POZpPmvG2s.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\nInetty-common-4.1.51.Final-EQ4GUV9DkTorusI-GqeLf1muCdRmsAr1_POZpPmvG2s.jar"
} dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\245\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/293-EMBEDDED-minlog-1.3.0-97OZ06VHik8-DZi9HJ9HdmEZxmQUvDOqD2zeAGbyTMI.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\n<minlog-1.3.0-97OZ06VHik8-DZi9HJ9HdmEZxmQUvDOqD2zeAGbyTMI.jar" }
dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\245\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/294-EMBEDDED-okhttp-2.7.5-iKyf0btR-CvMZkzB65wiXJDcQ4nWYCMbTMc3vr_n0Ko.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\n<okhttp-2.7.5-iKyf0btR-CvMZkzB65wiXJDcQ4nWYCMbTMc3vr_n0Ko.jar" }
dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\254\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/295-EMBEDDED-hk2-utils-2.4.0-b34-cCEbH5GIGb9q-_adPRnUrm4qddbib2w5up8g645WEtc.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\nChk2-utils-2.4.0-b34-cCEbH5GIGb9q-_adPRnUrm4qddbib2w5up8g645WEtc.jar" }
dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\271\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/296-EMBEDDED-aopalliance-repackaged-2.4.0-b34-XTywzs5yLHuoq5h7kxBTzbywyxKtXIyKdpHrb35gpks.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\nPaopalliance-repackaged-2.4.0-b34-XTywzs5yLHuoq5h7kxBTzbywyxKtXIyKdpHrb35gpks.jar"
} dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\261\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/297-EMBEDDED-servlet-api-2.5-20081211-BodWCWmW_gD2BKw7ZnLW9mPcd36kqDBW4kDQRW535HI.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\nHservlet-api-2.5-20081211-BodWCWmW_gD2BKw7ZnLW9mPcd36kqDBW4kDQRW535HI.jar" }
dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\263\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/298-EMBEDDED-jakarta.xml.bind-api-2.3.3-wEU59HLppt0MdoXqgtZ3KCJpq457rKLhRQDjgeDGzsU.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\nJjakarta.xml.bind-api-2.3.3-wEU59HLppt0MdoXqgtZ3KCJpq457rKLhRQDjgeDGzsU.jar"
} dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\261\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/299-EMBEDDED-jakarta.activation-1.2.2-AhVnc-SunQSNFKVq011kS-6fEFKnkdBy3z3tPGVubho.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\nHjakarta.activation-1.2.2-AhVnc-SunQSNFKVq011kS-6fEFKnkdBy3z3tPGVubho.jar" }
dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\257\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/300-EMBEDDED-jcip-annotations-1.0-1-T8z_g4Kq_FiZYsTtsmL2qlleNPHhHmEFfRxqluj8cyM.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\nFjcip-annotations-1.0-1-T8z_g4Kq_FiZYsTtsmL2qlleNPHhHmEFfRxqluj8cyM.jar" }
dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\270\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/301-EMBEDDED-org.abego.treelayout.core-1.0.3--l4xOVw5wufUasoPgfcgYJMWB7L6Qb02A46yy2-5MyY.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\nOorg.abego.treelayout.core-1.0.3--l4xOVw5wufUasoPgfcgYJMWB7L6Qb02A46yy2-5MyY.jar"
} dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\251\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/302-EMBEDDED-javax.json-1.0.4-Dh3sQKHt6WWUElHtqWiu7gUsxPUDeLwxbMSOgVm9vrQ.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\n@javax.json-1.0.4-Dh3sQKHt6WWUElHtqWiu7gUsxPUDeLwxbMSOgVm9vrQ.jar" }
dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\243\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/303-EMBEDDED-icu4j-58.2-lT4eg7K-fD6i-I2obBNhT0fp5x01eMhSHX8Yd1a2OWI.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\n:icu4j-58.2-lT4eg7K-fD6i-I2obBNhT0fp5x01eMhSHX8Yd1a2OWI.jar" } dependencies
{ type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\254\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/304-EMBEDDED-annotations-4.1.1.4-unNOHoTAnWFa9qCdMwNLTwRC-Hct7BIO-zdthqVlrhU.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\nCannotations-4.1.1.4-unNOHoTAnWFa9qCdMwNLTwRC-Hct7BIO-zdthqVlrhU.jar" }
dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\243\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/305-EMBEDDED-okio-1.6.0-EUvcH0czimi8vJWr8vXNxyvu7JGBLy_Ne1IcGTeHYmY.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\n:okio-1.6.0-EUvcH0czimi8vJWr8vXNxyvu7JGBLy_Ne1IcGTeHYmY.jar" } dependencies
{ type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\247\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/306-EMBEDDED-stax-api-1.0-2-6McOvXb5gslYKoLvgs9s4Up9WKSk3KXLe3_JiMgAibc.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\n>stax-api-1.0-2-6McOvXb5gslYKoLvgs9s4Up9WKSk3KXLe3_JiMgAibc.jar" }
dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\242\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/307-EMBEDDED-guice-3.0-GlnQQh_9NVzAtwtC3xwumvdEyKLQyS2jefX8ovB_HSI.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\n9guice-3.0-GlnQQh_9NVzAtwtC3xwumvdEyKLQyS2jefX8ovB_HSI.jar" } dependencies {
type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\247\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/308-EMBEDDED-javax.inject-1-kcdwRKUMSBY2wy2Rb9ickRinIZU5BFLIEGUID5V95_8.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\n>javax.inject-1-kcdwRKUMSBY2wy2Rb9ickRinIZU5BFLIEGUID5V95_8.jar" }
dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\275\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/309-EMBEDDED-geronimo-jcache_1.0_spec-1.0-alpha-1-AHChLlj0kblXGTkTJSmaYpRTDubDziXlC9yYsLcAlmw.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\nTgeronimo-jcache_1.0_spec-1.0-alpha-1-AHChLlj0kblXGTkTJSmaYpRTDubDziXlC9yYsLcAlmw.jar"
} dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\256\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/310-EMBEDDED-mssql-jdbc-6.2.1.jre7-nPollFCuNHHS5uLD2K78ziNuPa74s3NNIdyTw6W76AY.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\nEmssql-jdbc-6.2.1.jre7-nPollFCuNHHS5uLD2K78ziNuPa74s3NNIdyTw6W76AY.jar" }
dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\250\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/311-EMBEDDED-aopalliance-1.0-Ct3sZw_tzT8RPFyAkdeDKA0j9146y4QbYanNsHk3agg.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\n?aopalliance-1.0-Ct3sZw_tzT8RPFyAkdeDKA0j9146y4QbYanNsHk3agg.jar" }
dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\256\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/312-EMBEDDED-cglib-2.2.1-v20090111-QuHfsmvsvxpjPyW0fjn8xCK4XnfkwEaNmkT4hfX6C-I.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\nEcglib-2.2.1-v20090111-QuHfsmvsvxpjPyW0fjn8xCK4XnfkwEaNmkT4hfX6C-I.jar" }
dependencies { type_urn: "beam:artifact:type:file:v1" type_payload:
"\n\260\001/tmp/beam-artifact-staging/7bc3a43ef2491080d3e84fa50f12d0dfb3d5a8aed6dd14e4a9373fce9e6775d1/313-EMBEDDED-byte-buddy-agent-1.9.10-jtc50pEyEDJQ0wfS6OPJXwdYjvBUOrEdKIHQB2il4YI.jar"
role_urn: "beam:artifact:role:staging_to:v1" role_payload:
"\nGbyte-buddy-agent-1.9.10-jtc50pEyEDJQ0wfS6OPJXwdYjvBUOrEdKIHQB2il4YI.jar" }
20/12/16 12:35:23 INFO
org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: 1 Beam Fn
Logging clients still connected during shutdown. 20/12/16 12:35:23 WARN
org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown
endpoint. 20/12/16 12:35:23 ERROR
org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Failed to handle for
url: "InProcessServer_310"
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.StatusRuntimeException: CANCELLED:
Multiplexer hanging up at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Status.asRuntimeException(Status.java:533)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ClientCalls$StreamObserverToCallListenerAdapter.onClose(ClientCalls.java:449)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.CensusStatsModule$StatsClientInterceptor$1$1.onClose(CensusStatsModule.java:700)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.CensusTracingModule$TracingClientInterceptor$1$1.onClose(CensusTracingModule.java:399)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:521)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ClientCallImpl.access$300(ClientCallImpl.java:66)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.close(ClientCallImpl.java:641)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.access$700(ClientCallImpl.java:529)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInternal(ClientCallImpl.java:703)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:692)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748) 20/12/16 12:35:23 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122101000 20/12/16 12:35:23
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122101000 has completed, watermarks have been
updated. 20/12/16 12:35:23 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122101500 20/12/16 12:35:23
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122101500 has completed, watermarks have been
updated. 20/12/16 12:35:24 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122102000 20/12/16 12:35:24
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122102000 has completed, watermarks have been
updated. Dec 16, 2020 12:35:24 PM
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference
cleanQueue SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=2148,
target=directaddress:///InProcessServer_310} was not shutdown properly!!!
~*~*~* Make sure to call shutdown()/shutdownNow() and wait until
awaitTermination() returns true. java.lang.RuntimeException: ManagedChannel
allocation site at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:94)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:52)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:43)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:524)
at
org.apache.beam.sdk.fn.channel.ManagedChannelFactory.forDescriptor(ManagedChannelFactory.java:44)
at
org.apache.beam.fn.harness.data.BeamFnDataGrpcClient.lambda$getClientFor$0(BeamFnDataGrpcClient.java:116)
at
java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
at
org.apache.beam.fn.harness.data.BeamFnDataGrpcClient.getClientFor(BeamFnDataGrpcClient.java:110)
at
org.apache.beam.fn.harness.data.BeamFnDataGrpcClient.send(BeamFnDataGrpcClient.java:101)
at
org.apache.beam.fn.harness.data.QueueingBeamFnDataClient.send(QueueingBeamFnDataClient.java:141)
at
org.apache.beam.fn.harness.BeamFnDataWriteRunner.registerForOutput(BeamFnDataWriteRunner.java:169)
at
org.apache.beam.fn.harness.data.PTransformFunctionRegistry.lambda$register$0(PTransformFunctionRegistry.java:108)
at
org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:301)
at
org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:173)
at
org.apache.beam.fn.harness.control.BeamFnControlClient.lambda$processInstructionRequests$0(BeamFnControlClient.java:157)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748) 20/12/16 12:35:24 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122102500 20/12/16 12:35:24
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122102500 has completed, watermarks have been
updated. 20/12/16 12:35:24 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122103000 20/12/16 12:35:24
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122103000 has completed, watermarks have been
updated. 20/12/16 12:35:25 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122103500 20/12/16 12:35:25
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122103500 has completed, watermarks have been
updated. 20/12/16 12:35:25 INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder: No new watermarks
could be computed upon completion of batch: 1608122104000 20/12/16 12:35:25
INFO
org.apache.beam.runners.spark.util.GlobalWatermarkHolder$WatermarkAdvancingStreamingListener:
Batch with timestamp: 1608122104000 has completed, watermarks have been
updated. 20/12/16 12:35:25 INFO
org.apache.beam.runners.spark.SparkPipelineRunner: Job
metricstest0attemptedmetrictests0testattempteddistributionmetrics-jenkins-1216123446-8e555f47_c392d334-e797-4697-9941-406d17eee2b0
finished. 20/12/16 12:35:25 WARN org.apache.spark.streaming.StreamingContext:
StreamingContext has already been stopped 20/12/16 12:35:26 INFO
org.apache.beam.runners.jobsubmission.InMemoryJobService: Getting job metrics
for
metricstest0attemptedmetrictests0testattempteddistributionmetrics-jenkins-1216123446-8e555f47_c392d334-e797-4697-9941-406d17eee2b0
20/12/16 12:35:26 INFO
org.apache.beam.runners.jobsubmission.InMemoryJobService: Finished getting job
metrics for
metricstest0attemptedmetrictests0testattempteddistributionmetrics-jenkins-1216123446-8e555f47_c392d334-e797-4697-9941-406d17eee2b0
20/12/16 12:35:26 INFO org.apache.beam.runners.jobsubmission.JobServerDriver:
JobServer stopped on localhost:32955 20/12/16 12:35:26 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingServer
stopped on localhost:45763 20/12/16 12:35:26 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: Expansion stopped on
localhost:43129 20/12/16 12:35:26 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService
started on localhost:42201 20/12/16 12:35:26 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService
started on localhost:35099 20/12/16 12:35:26 INFO
org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on
localhost:44235 20/12/16 12:35:28 INFO
org.apache.beam.runners.portability.PortableRunner: Using job server endpoint:
localhost:44235 20/12/16 12:35:28 INFO
org.apache.beam.runners.portability.PortableRunner: PrepareJobResponse:
preparation_id:
"metricstest0attemptedmetrictests0testattemptedcountermetrics-jenkins-1216123528-50b3213a_1decf27a-e215-4db0-83cb-902493998e4a"
artifact_staging_endpoint { url: "localhost:42201" } staging_session_token:
"metricstest0attemptedmetrictests0testattemptedcountermetrics-jenkins-1216123528-50b3213a_1decf27a-e215-4db0-83cb-902493998e4a"
20/12/16 12:35:28 INFO
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging
artifacts for
metricstest0attemptedmetrictests0testattemptedcountermetrics-jenkins-1216123528-50b3213a_1decf27a-e215-4db0-83cb-902493998e4a.
20/12/16 12:35:28 INFO
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving
artifacts for
metricstest0attemptedmetrictests0testattemptedcountermetrics-jenkins-1216123528-50b3213a_1decf27a-e215-4db0-83cb-902493998e4a.EMBEDDED.
20/12/16 12:35:28 INFO
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting
313 artifacts for
metricstest0attemptedmetrictests0testattemptedcountermetrics-jenkins-1216123528-50b3213a_1decf27a-e215-4db0-83cb-902493998e4a.null.
20/12/16 12:35:29 INFO
org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts
fully staged for
metricstest0attemptedmetrictests0testattemptedcountermetrics-jenkins-1216123528-50b3213a_1decf27a-e215-4db0-83cb-902493998e4a.
Dec 16, 2020 12:35:29 PM
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference
cleanQueue SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=2144,
target=directaddress:///InProcessServer_311} was not shutdown properly!!!
~*~*~* Make sure to call shutdown()/shutdownNow() and wait until
awaitTermination() returns true. java.lang.RuntimeException: ManagedChannel
allocation site at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:94)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:52)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:43)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:524)
at
org.apache.beam.sdk.fn.channel.ManagedChannelFactory.forDescriptor(ManagedChannelFactory.java:44)
at
org.apache.beam.fn.harness.state.BeamFnStateGrpcClientCache$GrpcStateClient.<init>(BeamFnStateGrpcClientCache.java:89)
at
org.apache.beam.fn.harness.state.BeamFnStateGrpcClientCache$GrpcStateClient.<init>(BeamFnStateGrpcClientCache.java:79)
at
org.apache.beam.fn.harness.state.BeamFnStateGrpcClientCache.createBeamFnStateClient(BeamFnStateGrpcClientCache.java:75)
at
java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
at
org.apache.beam.fn.harness.state.BeamFnStateGrpcClientCache.forApiServiceDescriptor(BeamFnStateGrpcClientCache.java:71)
at
org.apache.beam.fn.harness.control.ProcessBundleHandler.createBundleProcessor(ProcessBundleHandler.java:456)
at
org.apache.beam.fn.harness.control.ProcessBundleHandler.lambda$processBundle$0(ProcessBundleHandler.java:284)
at
org.apache.beam.fn.harness.control.ProcessBundleHandler$BundleProcessorCache.get(ProcessBundleHandler.java:572)
at
org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:279)
at
org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:173)
at
org.apache.beam.fn.harness.control.BeamFnControlClient.lambda$processInstructionRequests$0(BeamFnControlClient.java:157)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748) Dec 16, 2020 12:35:29 PM
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference
cleanQueue SEVERE: *~*~*~ Channel ManagedChannelImpl{logId=2140,
target=directaddress:///InProcessServer_306} was not shutdown properly!!!
~*~*~* Make sure to call shutdown()/shutdownNow() and wait until
awaitTermination() returns true. java.lang.RuntimeException: ManagedChannel
allocation site at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper$ManagedChannelReference.<init>(ManagedChannelOrphanWrapper.java:94)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:52)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ManagedChannelOrphanWrapper.<init>(ManagedChannelOrphanWrapper.java:43)
at
org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:524)
at
org.apache.beam.sdk.fn.channel.ManagedChannelFactory.forDescriptor(ManagedChannelFactory.java:44)
at org.apache.beam.fn.harness.FnHarness.main(FnHarness.java:194) at
org.apache.beam.runners.fnexecution.environment.EmbeddedEnvironmentFactory.lambda$createEnvironment$0(EmbeddedEnvironmentFactory.java:100)
at java.util.concurrent.FutureTask.run(FutureTask.java:266) at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748) 20/12/16 12:35:29 INFO
org.apache.beam.runners.spark.SparkJobInvoker: Invoking job
metricstest0attemptedmetrictests0testattemptedcountermetrics-jenkins-1216123528-50b3213a_559c72f8-be73-40f7-9f31-eb235e4ae4ae
20/12/16 12:35:29 INFO org.apache.beam.runners.jobsubmission.JobInvocation:
Starting job invocation
metricstest0attemptedmetrictests0testattemptedcountermetrics-jenkins-1216123528-50b3213a_559c72f8-be73-40f7-9f31-eb235e4ae4ae
20/12/16 12:35:29 INFO org.apache.beam.runners.portability.PortableRunner:
RunJobResponse: job_id:
"metricstest0attemptedmetrictests0testattemptedcountermetrics-jenkins-1216123528-50b3213a_559c72f8-be73-40f7-9f31-eb235e4ae4ae"
20/12/16 12:35:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will
stage 313 files. (Enable logging at DEBUG level to see which files will be
staged.) 20/12/16 12:35:29 INFO
org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand
new Spark Context. 20/12/16 12:35:29 INFO
org.apache.beam.runners.spark.SparkPipelineRunner: Running job
metricstest0attemptedmetrictests0testattemptedcountermetrics-jenkins-1216123528-50b3213a_559c72f8-be73-40f7-9f31-eb235e4ae4ae
on Spark master local[4] 20/12/16 12:35:29 INFO
org.apache.beam.runners.spark.SparkPipelineRunner: Running job
metricstest0attemptedmetrictests0testattemptedcountermetrics-jenkins-1216123528-50b3213a_559c72f8-be73-40f7-9f31-eb235e4ae4ae
on Spark master local[4] 20/12/16 12:35:29 INFO
org.apache.beam.runners.spark.SparkPipelineRunner: Job
metricstest0attemptedmetrictests0testattemptedcountermetrics-jenkins-1216123528-50b3213a_559c72f8-be73-40f7-9f31-eb235e4ae4ae:
Pipeline translated successfully. Computing outputs 20/12/16 12:35:29 WARN
org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
support checkpointing 20/12/16 12:35:29 WARN
org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
support checkpointing 20/12/16 12:35:30 WARN
org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
support checkpointing 20/12/16 12:35:30 WARN
org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
support checkpointing 20/12/16 12:35:31 WARN
org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
support checkpointing 20/12/16 12:35:31 WARN
org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
support checkpointing 20/12/16 12:35:32 WARN
org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
support checkpointing 20/12/16 12:35:32 WARN
org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
support checkpointing 20/12/16 12:35:33 WARN
org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
support checkpointing 20/12/16 12:35:33 WARN
org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
support checkpointing 20/12/16 12:35:34 WARN
org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
support checkpointing 20/12/16 12:35:34 WARN
org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
support checkpointing 20/12/16 12:35:35 WARN
org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
support checkpointing 20/12/16 12:35:35 WARN
org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't support checkpointing
[... identical WARN repeated roughly twice per second, 20/12/16 12:35:36 through 12:35:59 ...]
20/12/16 12:35:59 WARN
org.apache.spark.streaming.util.BatchedWriteAheadLog: BatchedWriteAheadLog Writer queue interrupted.
Exception in thread "streaming-job-executor-0" java.lang.Error: java.lang.InterruptedException
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
	at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:206)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:222)
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:157)
	at org.apache.spark.util.ThreadUtils$.awaitReady(ThreadUtils.scala:243)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:750)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
	at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:972)
	at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:970)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
	at org.apache.spark.rdd.RDD.foreach(RDD.scala:970)
	at org.apache.spark.api.java.JavaRDDLike$class.foreach(JavaRDDLike.scala:351)
	at org.apache.spark.api.java.AbstractJavaRDDLike.foreach(JavaRDDLike.scala:45)
	at org.apache.beam.runners.spark.translation.streaming.UnboundedDataset.lambda$action$e3b46054$1(UnboundedDataset.java:79)
	at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$1.apply(JavaDStreamLike.scala:272)
	at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$1.apply(JavaDStreamLike.scala:272)
	at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:628)
	at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:628)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
	at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
	at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
	at scala.util.Try$.apply(Try.scala:192)
	at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
	at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	... 2 more
20/12/16 12:36:01 ERROR org.apache.spark.executor.Executor: Exception in task 1.0 in stage 0.0 (TID 1): /tmp/spark-9835f7a8-e2a0-43f8-8f9a-2d72a22d963b/userFiles-a825e30f-1f46-4120-90df-a67d711d2bff/fetchFileTemp79690841683280870.tmp
20/12/16 12:36:01 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job metricstest0attemptedmetrictests0testattemptedcountermetrics-jenkins-1216123528-50b3213a_559c72f8-be73-40f7-9f31-eb235e4ae4ae finished.
20/12/16 12:36:01 WARN org.apache.spark.streaming.StreamingContext: StreamingContext has already been stopped
20/12/16 12:36:01 ERROR org.apache.spark.executor.Executor: Exception in task 3.0 in stage 0.0 (TID 3): Cannot retrieve files with 'spark' scheme without an active SparkEnv.
20/12/16 12:36:01 ERROR org.apache.spark.executor.Executor: Exception in task 2.0 in stage 0.0 (TID 2): null
20/12/16 12:36:01 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 0.0 (TID 0): null
20/12/16 12:36:02 INFO org.apache.beam.runners.jobsubmission.InMemoryJobService: Getting job metrics for metricstest0attemptedmetrictests0testattemptedcountermetrics-jenkins-1216123528-50b3213a_559c72f8-be73-40f7-9f31-eb235e4ae4ae
20/12/16 12:36:02 INFO org.apache.beam.runners.jobsubmission.InMemoryJobService: Finished getting job metrics for metricstest0attemptedmetrictests0testattemptedcountermetrics-jenkins-1216123528-50b3213a_559c72f8-be73-40f7-9f31-eb235e4ae4ae
20/12/16 12:36:02 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobServer stopped on localhost:44235
20/12/16 12:36:02 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingServer stopped on localhost:42201
20/12/16 12:36:02 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Expansion stopped on localhost:35099
{code}
--
This message was sent by Atlassian Jira
(v8.3.4#803005)