[ https://issues.apache.org/jira/browse/FLINK-17092?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17081616#comment-17081616 ]
Dian Fu commented on FLINK-17092:
---------------------------------

Update: there are two test failures, BlinkStreamDependencyTests and StreamPandasUDFITTests; however, the root cause is the failure of BlinkStreamDependencyTests. The collected execution results of StreamPandasUDFITTests (which are stored in a static field on the client side) are mixed with the execution results of BlinkStreamDependencyTests, because the results of BlinkStreamDependencyTests are not cleared correctly when that test fails. (Maybe we should improve the tests to avoid this kind of confusion.)

> Pyflink failure for BlinkStreamDependencyTests and StreamPandasUDFITTests
> -------------------------------------------------------------------------
>
>                 Key: FLINK-17092
>                 URL: https://issues.apache.org/jira/browse/FLINK-17092
>             Project: Flink
>          Issue Type: Bug
>          Components: API / Python, Tests
>            Reporter: Zhijiang
>            Priority: Major
>              Labels: test-stability
>             Fix For: 1.11.0
>
> Build:
> [https://dev.azure.com/rmetzger/Flink/_build/results?buildId=7324&view=logs&j=9cada3cb-c1d3-5621-16da-0f718fb86602&t=14487301-07d2-5d56-5690-6dfab9ffd4d9]
> logs
> {code:java}
> 2020-04-10T13:05:25.7259119Z E                   : java.util.concurrent.ExecutionException: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
> 2020-04-10T13:05:25.7259755Z E  	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
> 2020-04-10T13:05:25.7260301Z E  	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
> 2020-04-10T13:05:25.7260927Z E  	at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1663)
> 2020-04-10T13:05:25.7261772Z E  	at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:74)
> 2020-04-10T13:05:25.7262405Z E  	at org.apache.flink.table.planner.delegation.ExecutorBase.execute(ExecutorBase.java:51)
> 2020-04-10T13:05:25.7263073Z E  	at org.apache.flink.table.api.internal.TableEnvironmentImpl.execute(TableEnvironmentImpl.java:719)
> 2020-04-10T13:05:25.7263588Z E  	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 2020-04-10T13:05:25.7264090Z E  	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 2020-04-10T13:05:25.7264668Z E  	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-04-10T13:05:25.7265175Z E  	at java.lang.reflect.Method.invoke(Method.java:498)
> 2020-04-10T13:05:25.7265807Z E  	at org.apache.flink.api.python.shaded.py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
> 2020-04-10T13:05:25.7266445Z E  	at org.apache.flink.api.python.shaded.py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
> 2020-04-10T13:05:25.7267288Z E  	at org.apache.flink.api.python.shaded.py4j.Gateway.invoke(Gateway.java:282)
> 2020-04-10T13:05:25.7267897Z E  	at org.apache.flink.api.python.shaded.py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
> 2020-04-10T13:05:25.7268518Z E  	at org.apache.flink.api.python.shaded.py4j.commands.CallCommand.execute(CallCommand.java:79)
> 2020-04-10T13:05:25.7269130Z E  	at org.apache.flink.api.python.shaded.py4j.GatewayConnection.run(GatewayConnection.java:238)
> 2020-04-10T13:05:25.7269623Z E  	at java.lang.Thread.run(Thread.java:748)
> 2020-04-10T13:05:25.7270112Z E  Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
> 2020-04-10T13:05:25.7270700Z E  	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:147)
> 2020-04-10T13:05:25.7271406Z E  	at org.apache.flink.client.program.PerJobMiniClusterFactory$PerJobMiniClusterJobClient.lambda$getJobExecutionResult$2(PerJobMiniClusterFactory.java:175)
> 2020-04-10T13:05:25.7272111Z E  	at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
> 2020-04-10T13:05:25.7272665Z E  	at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
> 2020-04-10T13:05:25.7273245Z E  	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
> 2020-04-10T13:05:25.7273909Z E  	at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
> 2020-04-10T13:05:25.7274514Z E  	at org.apache.flink.runtime.rpc.akka.AkkaInvocationHandler.lambda$invokeRpc$0(AkkaInvocationHandler.java:229)
> 2020-04-10T13:05:25.7275147Z E  	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
> 2020-04-10T13:05:25.7275800Z E  	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
> 2020-04-10T13:05:25.7276447Z E  	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
> 2020-04-10T13:05:25.7277239Z E  	at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
> 2020-04-10T13:05:25.7277805Z E  	at org.apache.flink.runtime.concurrent.FutureUtils$1.onComplete(FutureUtils.java:874)
> 2020-04-10T13:05:25.7278328Z E  	at akka.dispatch.OnComplete.internal(Future.scala:264)
> 2020-04-10T13:05:25.7278804Z E  	at akka.dispatch.OnComplete.internal(Future.scala:261)
> 2020-04-10T13:05:25.7279258Z E  	at akka.dispatch.japi$CallbackBridge.apply(Future.scala:191)
> 2020-04-10T13:05:25.7279883Z E  	at akka.dispatch.japi$CallbackBridge.apply(Future.scala:188)
> 2020-04-10T13:05:25.7280352Z E  	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:36)
> 2020-04-10T13:05:25.7280917Z E  	at org.apache.flink.runtime.concurrent.Executors$DirectExecutionContext.execute(Executors.java:74)
> 2020-04-10T13:05:25.7281501Z E  	at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:44)
> 2020-04-10T13:05:25.7282029Z E  	at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:252)
> 2020-04-10T13:05:25.7282546Z E  	at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:572)
> 2020-04-10T13:05:25.7283089Z E  	at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:22)
> 2020-04-10T13:05:25.7283728Z E  	at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:21)
> 2020-04-10T13:05:25.7284305Z E  	at scala.concurrent.Future$$anonfun$andThen$1.apply(Future.scala:436)
> 2020-04-10T13:05:25.7284811Z E  	at scala.concurrent.Future$$anonfun$andThen$1.apply(Future.scala:435)
> 2020-04-10T13:05:25.7285393Z E  	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:36)
> 2020-04-10T13:05:25.7285917Z E  	at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
> 2020-04-10T13:05:25.7286542Z E  	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:91)
> 2020-04-10T13:05:25.7287470Z E  	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
> 2020-04-10T13:05:25.7288090Z E  	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
> 2020-04-10T13:05:25.7288679Z E  	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
> 2020-04-10T13:05:25.7289260Z E  	at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:90)
> 2020-04-10T13:05:25.7289790Z E  	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
> 2020-04-10T13:05:25.7290372Z E  	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:44)
> 2020-04-10T13:05:25.7290942Z E  	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
> 2020-04-10T13:05:25.7291477Z E  	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
> 2020-04-10T13:05:25.7292000Z E  	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
> 2020-04-10T13:05:25.7292640Z E  	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> 2020-04-10T13:05:25.7293237Z E  Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
> 2020-04-10T13:05:25.7293922Z E  	at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:112)
> 2020-04-10T13:05:25.7294717Z E  	at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:78)
> 2020-04-10T13:05:25.7295505Z E  	at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:189)
> 2020-04-10T13:05:25.7296138Z E  	at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:183)
> 2020-04-10T13:05:25.7296934Z E  	at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:177)
> 2020-04-10T13:05:25.7297700Z E  	at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:497)
> 2020-04-10T13:05:25.7298415Z E  	at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:384)
> 2020-04-10T13:05:25.7298933Z E  	at sun.reflect.GeneratedMethodAccessor33.invoke(Unknown Source)
> 2020-04-10T13:05:25.7299428Z E  	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 2020-04-10T13:05:25.7299950Z E  	at java.lang.reflect.Method.invoke(Method.java:498)
> 2020-04-10T13:05:25.7300468Z E  	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:284)
> 2020-04-10T13:05:25.7301072Z E  	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:199)
> 2020-04-10T13:05:25.7301695Z E  	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
> 2020-04-10T13:05:25.7302338Z E  	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
> 2020-04-10T13:05:25.7302886Z E  	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
> 2020-04-10T13:05:25.7303385Z E  	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
> 2020-04-10T13:05:25.7303872Z E  	at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
> 2020-04-10T13:05:25.7304396Z E  	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
> 2020-04-10T13:05:25.7304902Z E  	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
> 2020-04-10T13:05:25.7305487Z E  	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
> 2020-04-10T13:05:25.7305991Z E  	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
> 2020-04-10T13:05:25.7306481Z E  	at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
> 2020-04-10T13:05:25.7307236Z E  	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
> 2020-04-10T13:05:25.7307725Z E  	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
> 2020-04-10T13:05:25.7308191Z E  	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
> 2020-04-10T13:05:25.7308637Z E  	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
> 2020-04-10T13:05:25.7309083Z E  	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
> 2020-04-10T13:05:25.7309523Z E  	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
> 2020-04-10T13:05:25.7309849Z E  	... 4 more
> 2020-04-10T13:05:25.7311264Z E  Caused by: java.lang.RuntimeException: Failed to create stage bundle factory! INFO:root:Initializing python harness: /__w/3/s/flink-python/pyflink/fn_execution/boot.py --id=15-1 --logging_endpoint=localhost:39057 --artifact_endpoint=localhost:32894 --provision_endpoint=localhost:37718 --control_endpoint=localhost:40214
> 2020-04-10T13:05:25.7313317Z E  INFO:root:Run command: /__w/3/s/flink-python/.tox/py37/bin/python -m pip install --ignore-installed -r /tmp/blobStore-ac8e0f93-b8c7-47f7-a367-de17ad11dd74/job_00cbf232f30575e95b57561d7acbcbb0/blob_p-36ae24f7f10f4119f242d182384524abe3e0b284-fe9ee346d25f0564b02f2773d9520e29 --prefix /tmp/python-dist-e077c0d2-17c9-4590-81fc-724ff15f4552/python-requirements
> 2020-04-10T13:05:25.7314128Z E
> 2020-04-10T13:05:25.7315036Z E  Collecting cloudpickle==1.2.2 (from -r /tmp/blobStore-ac8e0f93-b8c7-47f7-a367-de17ad11dd74/job_00cbf232f30575e95b57561d7acbcbb0/blob_p-36ae24f7f10f4119f242d182384524abe3e0b284-fe9ee346d25f0564b02f2773d9520e29 (line 1))
> 2020-04-10T13:05:25.7316644Z E  Could not find a version that satisfies the requirement cloudpickle==1.2.2 (from -r /tmp/blobStore-ac8e0f93-b8c7-47f7-a367-de17ad11dd74/job_00cbf232f30575e95b57561d7acbcbb0/blob_p-36ae24f7f10f4119f242d182384524abe3e0b284-fe9ee346d25f0564b02f2773d9520e29 (line 1)) (from versions: )
> 2020-04-10T13:05:25.7318575Z E  No matching distribution found for cloudpickle==1.2.2 (from -r /tmp/blobStore-ac8e0f93-b8c7-47f7-a367-de17ad11dd74/job_00cbf232f30575e95b57561d7acbcbb0/blob_p-36ae24f7f10f4119f242d182384524abe3e0b284-fe9ee346d25f0564b02f2773d9520e29 (line 1))
> 2020-04-10T13:05:25.7319332Z E  You are using pip version 10.0.1, however version 20.0.2 is available.
> 2020-04-10T13:05:25.7319953Z E  You should consider upgrading via the 'pip install --upgrade pip' command.
> 2020-04-10T13:05:25.7320363Z E  Traceback (most recent call last):
> 2020-04-10T13:05:25.7320978Z E    File "/__w/3/s/flink-python/dev/.conda/lib/python3.7/runpy.py", line 193, in _run_module_as_main
> 2020-04-10T13:05:25.7321415Z E      "__main__", mod_spec)
> 2020-04-10T13:05:25.7322006Z E    File "/__w/3/s/flink-python/dev/.conda/lib/python3.7/runpy.py", line 85, in _run_code
> 2020-04-10T13:05:25.7322403Z E      exec(code, run_globals)
> 2020-04-10T13:05:25.7322986Z E    File "/__w/3/s/flink-python/pyflink/fn_execution/boot.py", line 213, in <module>
> 2020-04-10T13:05:25.7323375Z E      pip_install_requirements()
> 2020-04-10T13:05:25.7323992Z E    File "/__w/3/s/flink-python/pyflink/fn_execution/boot.py", line 121, in pip_install_requirements
> 2020-04-10T13:05:25.7324462Z E      (" ".join(pip_install_commands), exit_code))
> 2020-04-10T13:05:25.7325920Z E  Exception: Run command: /__w/3/s/flink-python/.tox/py37/bin/python -m pip install --ignore-installed -r /tmp/blobStore-ac8e0f93-b8c7-47f7-a367-de17ad11dd74/job_00cbf232f30575e95b57561d7acbcbb0/blob_p-36ae24f7f10f4119f242d182384524abe3e0b284-fe9ee346d25f0564b02f2773d9520e29 --prefix /tmp/python-dist-e077c0d2-17c9-4590-81fc-724ff15f4552/python-requirements error! exit code: 1
> 2020-04-10T13:05:25.7326930Z E
> 2020-04-10T13:05:25.7327638Z E  	at org.apache.flink.python.AbstractPythonFunctionRunner.createStageBundleFactory(AbstractPythonFunctionRunner.java:197)
> 2020-04-10T13:05:25.7328311Z E  	at org.apache.flink.python.AbstractPythonFunctionRunner.open(AbstractPythonFunctionRunner.java:164)
> 2020-04-10T13:05:25.7329060Z E  	at org.apache.flink.table.runtime.runners.python.scalar.AbstractGeneralPythonScalarFunctionRunner.open(AbstractGeneralPythonScalarFunctionRunner.java:65)
> 2020-04-10T13:05:25.7329964Z E  	at org.apache.flink.table.runtime.operators.python.AbstractStatelessFunctionOperator$ProjectUdfInputPythonScalarFunctionRunner.open(AbstractStatelessFunctionOperator.java:186)
> 2020-04-10T13:05:25.7330906Z E  	at org.apache.flink.streaming.api.operators.python.AbstractPythonFunctionOperator.open(AbstractPythonFunctionOperator.java:142)
> 2020-04-10T13:05:25.7331676Z E  	at org.apache.flink.table.runtime.operators.python.AbstractStatelessFunctionOperator.open(AbstractStatelessFunctionOperator.java:131)
> 2020-04-10T13:05:25.7332543Z E  	at org.apache.flink.table.runtime.operators.python.scalar.AbstractPythonScalarFunctionOperator.open(AbstractPythonScalarFunctionOperator.java:88)
> 2020-04-10T13:05:25.7333381Z E  	at org.apache.flink.table.runtime.operators.python.scalar.AbstractBaseRowPythonScalarFunctionOperator.open(AbstractBaseRowPythonScalarFunctionOperator.java:80)
> 2020-04-10T13:05:25.7334227Z E  	at org.apache.flink.table.runtime.operators.python.scalar.BaseRowPythonScalarFunctionOperator.open(BaseRowPythonScalarFunctionOperator.java:64)
> 2020-04-10T13:05:25.7334983Z E  	at org.apache.flink.streaming.runtime.tasks.OperatorChain.initializeStateAndOpenOperators(OperatorChain.java:294)
> 2020-04-10T13:05:25.7335707Z E  	at org.apache.flink.streaming.runtime.tasks.StreamTask.lambda$beforeInvoke$0(StreamTask.java:445)
> 2020-04-10T13:05:25.7336527Z E  	at org.apache.flink.streaming.runtime.tasks.StreamTaskActionExecutor$SynchronizedStreamTaskActionExecutor.runThrowing(StreamTaskActionExecutor.java:92)
> 2020-04-10T13:05:25.7337401Z E  	at org.apache.flink.streaming.runtime.tasks.StreamTask.beforeInvoke(StreamTask.java:441)
> 2020-04-10T13:05:25.7337953Z E  	at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:462)
> 2020-04-10T13:05:25.7338492Z E  	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:718)
> 2020-04-10T13:05:25.7338973Z E  	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:542)
> 2020-04-10T13:05:25.7339424Z E  	at java.lang.Thread.run(Thread.java:748)
> 2020-04-10T13:05:25.7340098Z E  Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalStateException: Process died with exit code 0
> 2020-04-10T13:05:25.7340908Z E  	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
> 2020-04-10T13:05:25.7341593Z E  	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
> 2020-04-10T13:05:25.7342253Z E  	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
> 2020-04-10T13:05:25.7342958Z E  	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
> 2020-04-10T13:05:25.7343715Z E  	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
> 2020-04-10T13:05:25.7344482Z E  	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:331)
> 2020-04-10T13:05:25.7345342Z E  	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:320)
> 2020-04-10T13:05:25.7346088Z E  	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:250)
> 2020-04-10T13:05:25.7346778Z E  	at org.apache.flink.python.AbstractPythonFunctionRunner.createStageBundleFactory(AbstractPythonFunctionRunner.java:195)
> 2020-04-10T13:05:25.7347380Z E  	... 16 more
> 2020-04-10T13:05:25.7347751Z E  Caused by: java.lang.IllegalStateException: Process died with exit code 0
> 2020-04-10T13:05:25.7348374Z E  	at org.apache.beam.runners.fnexecution.environment.ProcessManager$RunningProcess.isAliveOrThrow(ProcessManager.java:72)
> 2020-04-10T13:05:25.7349245Z E  	at org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory.createEnvironment(ProcessEnvironmentFactory.java:137)
> 2020-04-10T13:05:25.7349959Z E  	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:200)
> 2020-04-10T13:05:25.7350650Z E  	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:184)
> 2020-04-10T13:05:25.7351381Z E  	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
> 2020-04-10T13:05:25.7352105Z E  	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
> 2020-04-10T13:05:25.7352834Z E  	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
> 2020-04-10T13:05:25.7353547Z E  	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
> 2020-04-10T13:05:25.7354141Z E  	... 24 more
> 2020-04-10T13:05:25.7354303Z
> 2020-04-10T13:05:25.7354844Z .tox/py37/lib/python3.7/site-packages/py4j/protocol.py:328: Py4JJavaError
> {code}

-- This message was sent by Atlassian Jira (v8.3.4#803005)
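The cross-test contamination described in the comment above can be sketched in a few lines of Python. All names here (TestBase, collect, results) are hypothetical illustrations, not PyFlink's actual test classes: a result buffer held in a static (class-level) field survives across test cases, so a test that fails before clearing it leaks its rows into the next test's collected results. Clearing the field unconditionally in tearDown avoids the confusion.

```python
class TestBase:
    # Class-level (static) field shared by every test case -- the hazard.
    results = []

    def collect(self, row):
        TestBase.results.append(row)

    def tearDown(self):
        # Runs whether the test passed or failed, so a failing test can no
        # longer leak its rows into the next test's collected results.
        TestBase.results.clear()


def run_failing_test(t):
    # Collects a row and then fails, like BlinkStreamDependencyTests did.
    t.collect("row-from-failing-test")
    raise RuntimeError("job execution failed")


t1 = TestBase()
try:
    run_failing_test(t1)
except RuntimeError:
    pass
t1.tearDown()  # without this, the stale row survives into the next test

t2 = TestBase()
t2.collect("row-from-passing-test")
assert TestBase.results == ["row-from-passing-test"]
```

Without the tearDown call, the final assertion would see both rows, which is exactly the mixed-results symptom that made StreamPandasUDFITTests appear to fail alongside BlinkStreamDependencyTests.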