Thanks for the report, Yidan.

This is tracked in FLINK-23024 and will hopefully be fixed in 1.13.2.
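For anyone curious about the mechanics: as far as I understand, the non-leader JobManagers forward the request over the Akka RPC layer, which serializes the response with Java serialization, and TaskManagerInfoWithSlots does not implement java.io.Serializable — that is why only remote (non-leader) requests fail. A minimal sketch of that failure mode (the TaskManagerDetails class below is a made-up stand-in, not Flink's actual type):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.UncheckedIOException;

public class SerializationDemo {
    // Made-up stand-in for an RPC response type that forgot "implements Serializable".
    static class TaskManagerDetails {
        final String id = "tm-1";
    }

    // Returns true if Java serialization accepts the object,
    // false when it throws NotSerializableException.
    static boolean canSerialize(Object value) {
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(value);
            return true;
        } catch (NotSerializableException e) {
            return false; // the same failure the AkkaRpcActor reports for remote calls
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(canSerialize("a string"));               // true: String is Serializable
        System.out.println(canSerialize(new TaskManagerDetails())); // false
    }
}
```

A local (leader-side) call never goes through this serialization path, which matches the observation that only the non-leader REST endpoints break.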

Best,
Yangze Guo

On Fri, Jun 18, 2021 at 10:00 AM yidan zhao <hinobl...@gmail.com> wrote:
>
>  Yeah, I also think it is a bug.
>
> Arvid Heise <ar...@apache.org> wrote on Thursday, June 17, 2021, at 10:13 PM:
> >
> > Hi Yidan,
> >
> > could you check whether the bucket exists and is accessible? It seems this
> > directory cannot be created:
> > bos://flink-bucket/flink/ha/opera_upd_FlinkTestJob3/blob.
> >
> > The second issue looks like a bug. I will create a ticket.
> >
> > On Wed, Jun 16, 2021 at 5:21 AM yidan zhao <hinobl...@gmail.com> wrote:
> >>
> >> Does anyone have any idea? Here is another exception stack.
> >>
> >>
> >> Unhandled exception. org.apache.flink.runtime.rpc.akka.exceptions.AkkaRpcException: Failed to serialize the result for RPC call : requestTaskManagerDetailsInfo.
> >>     at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.serializeRemoteResultAndVerifySize(AkkaRpcActor.java:404) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$sendAsyncResponse$0(AkkaRpcActor.java:360) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:836) ~[?:1.8.0_251]
> >>     at java.util.concurrent.CompletableFuture.uniHandleStage(CompletableFuture.java:848) ~[?:1.8.0_251]
> >>     at java.util.concurrent.CompletableFuture.handle(CompletableFuture.java:2168) ~[?:1.8.0_251]
> >>     at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.sendAsyncResponse(AkkaRpcActor.java:352) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:319) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at akka.actor.Actor$class.aroundReceive(Actor.scala:517) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592) [flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at akka.actor.ActorCell.invoke(ActorCell.scala:561) [flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258) [flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at akka.dispatch.Mailbox.run(Mailbox.scala:225) [flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at akka.dispatch.Mailbox.exec(Mailbox.scala:235) [flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) [flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) [flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) [flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) [flink-dist_2.11-1.13.1.jar:1.13.1]
> >> Caused by: java.io.NotSerializableException: org.apache.flink.runtime.resourcemanager.TaskManagerInfoWithSlots
> >>     at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184) ~[?:1.8.0_251]
> >>     at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348) ~[?:1.8.0_251]
> >>     at org.apache.flink.util.InstantiationUtil.serializeObject(InstantiationUtil.java:624) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at org.apache.flink.runtime.rpc.akka.AkkaRpcSerializedValue.valueOf(AkkaRpcSerializedValue.java:66) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.serializeRemoteResultAndVerifySize(AkkaRpcActor.java:387) ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >>     ... 27 more
> >>
> >> yidan zhao <hinobl...@gmail.com> wrote on Friday, June 11, 2021, at 4:10 PM:
> >> >
> >> > I upgraded Flink from 1.12 to 1.13.1, and the REST API
> >> > (http://xxx:8600/#/task-manager/xxx:34575-c53c6c/metrics) failed.
> >> > My standalone cluster includes 30 JobManagers and 30 TaskManagers, and
> >> > I found the API only works on the one JobManager that is currently the
> >> > REST API leader.
> >> >
> >> > For example, jobmanager1 (http://jobmanager1:8600/#/...) and
> >> > jobmanager2 (http://jobmanager2:8600/#/...). The overview page works on
> >> > all of them, but the taskmanager detail page has this issue.
> >> >
> >> > Here is the error log:
> >> >
> >> >
> >> > 2021-06-10 13:00:27,395 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - --------------------------------------------------------------------------------
> >> > 2021-06-10 13:00:27,399 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - Preconfiguration:
> >> > 2021-06-10 13:00:27,400 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] -
> >> >
> >> > RESOURCE_PARAMS extraction logs:
> >> > jvm_params: -Xmx2093796552 -Xms2093796552 -XX:MaxMetaspaceSize=536870912
> >> > dynamic_configs: -D jobmanager.memory.off-heap.size=268435456b -D jobmanager.memory.jvm-overhead.min=322122552b -D jobmanager.memory.jvm-metaspace.size=536870912b -D jobmanager.memory.heap.size=2093796552b -D jobmanager.memory.jvm-overhead.max=322122552b
> >> > logs: INFO  [] - Loading configuration property: taskmanager.numberOfTaskSlots, 20
> >> > INFO  [] - Loading configuration property: cluster.evenly-spread-out-slots, true
> >> > INFO  [] - Loading configuration property: parallelism.default, 1
> >> > INFO  [] - Loading configuration property: jobmanager.memory.process.size, 3gb
> >> > INFO  [] - Loading configuration property: jobmanager.memory.jvm-metaspace.size, 512mb
> >> > INFO  [] - Loading configuration property: jobmanager.memory.jvm-overhead.fraction, 0.1
> >> > INFO  [] - Loading configuration property: jobmanager.memory.jvm-overhead.min, 192mb
> >> > INFO  [] - Loading configuration property: jobmanager.memory.jvm-overhead.max, 512mb
> >> > INFO  [] - Loading configuration property: jobmanager.memory.off-heap.size, 256mb
> >> > INFO  [] - Loading configuration property: taskmanager.memory.process.size, 20gb
> >> > INFO  [] - Loading configuration property: taskmanager.memory.jvm-metaspace.size, 512mb
> >> > INFO  [] - Loading configuration property: taskmanager.memory.jvm-overhead.fraction, 0.1
> >> > INFO  [] - Loading configuration property: taskmanager.memory.jvm-overhead.min, 192mb
> >> > INFO  [] - Loading configuration property: taskmanager.memory.jvm-overhead.max, 512mb
> >> > INFO  [] - Loading configuration property: taskmanager.memory.segment-size, 32kb
> >> > INFO  [] - Loading configuration property: taskmanager.memory.managed.fraction, 0.4
> >> > INFO  [] - Loading configuration property: taskmanager.memory.managed.size, 64mb
> >> > INFO  [] - Loading configuration property: taskmanager.memory.network.fraction, 0.1
> >> > INFO  [] - Loading configuration property: taskmanager.memory.network.min, 1gb
> >> > INFO  [] - Loading configuration property: taskmanager.memory.network.max, 2gb
> >> > INFO  [] - Loading configuration property: taskmanager.memory.framework.off-heap.size, 256mb
> >> > INFO  [] - Loading configuration property: taskmanager.memory.task.off-heap.size, 512mb
> >> > INFO  [] - Loading configuration property: taskmanager.memory.framework.heap.size, 256mb
> >> > INFO  [] - Loading configuration property: high-availability, zookeeper
> >> > INFO  [] - Loading configuration property: high-availability.storageDir, bos://flink-bucket/flink/ha
> >> > INFO  [] - Loading configuration property: high-availability.zookeeper.quorum, bjhw-aisecurity-cassandra01.bjhw:9681,bjhw-aisecurity-cassandra02.bjhw:9681,bjhw-aisecurity-cassandra03.bjhw:9681,bjhw-aisecurity-cassandra04.bjhw:9681,bjhw-aisecurity-cassandra05.bjhw:9681
> >> > INFO  [] - Loading configuration property: high-availability.zookeeper.path.root, /flink
> >> > INFO  [] - Loading configuration property: high-availability.cluster-id, opera_upd_FlinkTestJob3
> >> > INFO  [] - Loading configuration property: web.checkpoints.history, 100
> >> > INFO  [] - Loading configuration property: state.checkpoints.num-retained, 100
> >> > INFO  [] - Loading configuration property: state.checkpoints.dir, bos://flink-bucket/flink/default-checkpoints
> >> > INFO  [] - Loading configuration property: state.savepoints.dir, bos://flink-bucket/flink/default-savepoints
> >> > INFO  [] - Loading configuration property: jobmanager.execution.failover-strategy, region
> >> > INFO  [] - Loading configuration property: web.submit.enable, false
> >> > INFO  [] - Loading configuration property: jobmanager.archive.fs.dir, bos://flink-bucket/flink/completed-jobs/opera_upd_FlinkTestJob3
> >> > INFO  [] - Loading configuration property: historyserver.archive.fs.dir, bos://flink-bucket/flink/completed-jobs/opera_upd_FlinkTestJob3
> >> > INFO  [] - Loading configuration property: historyserver.archive.fs.refresh-interval, 10000
> >> > INFO  [] - Loading configuration property: rest.port, 8600
> >> > INFO  [] - Loading configuration property: historyserver.web.port, 8700
> >> > INFO  [] - Loading configuration property: high-availability.jobmanager.port, 9318
> >> > INFO  [] - Loading configuration property: blob.server.port, 9320
> >> > INFO  [] - Loading configuration property: taskmanager.rpc.port, 9319
> >> > INFO  [] - Loading configuration property: taskmanager.data.port, 9325
> >> > INFO  [] - Loading configuration property: metrics.internal.query-service.port, 9321,9322
> >> > INFO  [] - Loading configuration property: akka.ask.timeout, 60s
> >> > INFO  [] - Loading configuration property: taskmanager.network.request-backoff.max, 60000
> >> > INFO  [] - Loading configuration property: akka.framesize, 104857600b
> >> > INFO  [] - Loading configuration property: env.java.home, /home/work/antibotFlink/java8
> >> > INFO  [] - Loading configuration property: env.pid.dir, /home/work/antibotFlink/flink-1.13.1-sm
> >> > INFO  [] - Loading configuration property: io.tmp.dirs, /home/work/antibotFlink/flink-1.13.1-sm/tmp
> >> > INFO  [] - Loading configuration property: web.tmpdir, /home/work/antibotFlink/flink-1.13.1-sm/tmp
> >> > INFO  [] - Loading configuration property: env.java.opts, "-XX:+UseG1GC -XX:-OmitStackTraceInFastThrow"
> >> > INFO  [] - Final Master Memory configuration:
> >> > INFO  [] -   Total Process Memory: 3.000gb (3221225472 bytes)
> >> > INFO  [] -     Total Flink Memory: 2.200gb (2362232008 bytes)
> >> > INFO  [] -       JVM Heap:         1.950gb (2093796552 bytes)
> >> > INFO  [] -       Off-heap:         256.000mb (268435456 bytes)
> >> > INFO  [] -     JVM Metaspace:      512.000mb (536870912 bytes)
> >> > INFO  [] -     JVM Overhead:       307.200mb (322122552 bytes)
> >> >
> >> > 2021-06-10 13:00:27,400 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - --------------------------------------------------------------------------------
> >> > 2021-06-10 13:00:27,400 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - Starting StandaloneSessionClusterEntrypoint (Version: 1.13.1, Scala: 2.11, Rev:a7f3192, Date:2021-05-25T12:02:11+02:00)
> >> > 2021-06-10 13:00:27,401 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] -  OS current user: work
> >> > 2021-06-10 13:00:27,753 WARN  org.apache.hadoop.util.NativeCodeLoader                      [] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> >> > 2021-06-10 13:00:27,847 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - Current Hadoop/Kerberos user: work
> >> > 2021-06-10 13:00:27,847 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - JVM: Java HotSpot(TM) 64-Bit Server VM - Oracle Corporation - 1.8/25.251-b08
> >> > 2021-06-10 13:00:27,847 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - Maximum heap size: 1998 MiBytes
> >> > 2021-06-10 13:00:27,847 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - JAVA_HOME: /home/work/antibotFlink/java8
> >> > 2021-06-10 13:00:27,850 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - Hadoop version: 2.7.5
> >> > 2021-06-10 13:00:27,850 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] -  JVM Options:
> >> > 2021-06-10 13:00:27,850 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - -Xmx2093796552
> >> > 2021-06-10 13:00:27,850 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - -Xms2093796552
> >> > 2021-06-10 13:00:27,850 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - -XX:MaxMetaspaceSize=536870912
> >> > 2021-06-10 13:00:27,850 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - -XX:+UseG1GC
> >> > 2021-06-10 13:00:27,850 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - -XX:-OmitStackTraceInFastThrow
> >> > 2021-06-10 13:00:27,850 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - -Dlog.file=/home/work/antibotFlink/flink-1.13.1-sm/log/flink-work-standalonesession-0-nj03-ecom-adapp-m12-39.nj03.baidu.com.log
> >> > 2021-06-10 13:00:27,850 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - -Dlog4j.configuration=file:/home/work/antibotFlink/flink-1.13.1-sm/conf/log4j.properties
> >> > 2021-06-10 13:00:27,850 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - -Dlog4j.configurationFile=file:/home/work/antibotFlink/flink-1.13.1-sm/conf/log4j.properties
> >> > 2021-06-10 13:00:27,850 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - -Dlogback.configurationFile=file:/home/work/antibotFlink/flink-1.13.1-sm/conf/logback.xml
> >> > 2021-06-10 13:00:27,851 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - Program Arguments:
> >> > 2021-06-10 13:00:27,852 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - --configDir
> >> > 2021-06-10 13:00:27,852 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - /home/work/antibotFlink/flink-1.13.1-sm/conf
> >> > 2021-06-10 13:00:27,852 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - --executionMode
> >> > 2021-06-10 13:00:27,852 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - cluster
> >> > 2021-06-10 13:00:27,852 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - --host
> >> > 2021-06-10 13:00:27,852 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - nj03-ecom-adapp-m12-39.nj03.baidu.com
> >> > 2021-06-10 13:00:27,853 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - -D
> >> > 2021-06-10 13:00:27,853 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - jobmanager.memory.off-heap.size=268435456b
> >> > 2021-06-10 13:00:27,853 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - -D
> >> > 2021-06-10 13:00:27,853 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - jobmanager.memory.jvm-overhead.min=322122552b
> >> > 2021-06-10 13:00:27,853 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - -D
> >> > 2021-06-10 13:00:27,853 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - jobmanager.memory.jvm-metaspace.size=536870912b
> >> > 2021-06-10 13:00:27,853 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - -D
> >> > 2021-06-10 13:00:27,853 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - jobmanager.memory.heap.size=2093796552b
> >> > 2021-06-10 13:00:27,853 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - -D
> >> > 2021-06-10 13:00:27,853 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - jobmanager.memory.jvm-overhead.max=322122552b
> >> > 2021-06-10 13:00:27,853 INFO  org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] - Classpath:
> >> > /home/work/antibotFlink/flink-1.13.1-sm/lib/flink-csv-1.13.1.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/flink-json-1.13.1.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/flink-shaded-zookeeper-3.4.14.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/flink-table_2.11-1.13.1.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/flink-table-blink_2.11-1.13.1.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/log4j-1.2-api-2.12.1.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/log4j-api-2.12.1.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/log4j-core-2.12.1.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/log4j-slf4j-impl-2.12.1.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/flink-dist_2.11-1.13.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/etc/hadoop:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/avro-1.7.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/activation-1.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-codec-1.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jetty-sslengine-6.1.26.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jetty-6.1.26.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/curator-client-2.7.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-collections-3.2.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jets3t-0.9.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hado
op/common/lib/joda-time-2.10.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jersey-core-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/junit-4.11.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/servlet-api-2.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/stax-api-1.0-2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jackson-core-2.10.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/hamcrest-core-1.3.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/guava-11.0.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-logging-1.1.3.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/httpcore-4.4.10.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/httpclient-4.5.6.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jackson-databind-2.10.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jersey-json-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/curator-recipes-2.7.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jettison-1.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/li
b/paranamer-2.3.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/asm-3.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jersey-server-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/xmlenc-0.52.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-net-3.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-cli-1.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jsp-api-2.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/zookeeper-3.4.6.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-lang-2.6.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/htrace-core-3.1.0-incubating.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-io-2.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-digester-1.8.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/gson-2.2.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-math3-3.1.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jacks
on-annotations-2.10.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/log4j-1.2.17.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/bce-java-sdk-0.10.82.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/xz-1.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/hadoop-auth-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jsch-0.1.54.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/curator-framework-2.7.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jsr305-3.0.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/hadoop-annotations-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/hadoop-common-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/hadoop-nfs-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/hadoop-common-2.7.5-tests.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/work/antibotFlink/hadoop-cli
ent-2.7.5/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/netty-all-4.0.23.Final.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/asm-3.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/htrace-core-3.1.0-incubating.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/commons-io-2.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/jsr305-3.0.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/hadoop-hdfs-2.7.5-tests.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/hadoop-hdfs-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/hadoop-hdfs-nfs-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/libdfs-java-2.0.5-support-community.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/bos-hdfs-sdk-1.0.1-SNAPSHOT-0.jar:/home/w
ork/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/activation-1.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/commons-codec-1.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jetty-6.1.26.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/commons-collections-3.2.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/zookeeper-3.4.6-tests.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/servlet-api-2.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/guava-11.0.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jersey-client-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/javax.inject-1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jersey-json-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/
lib/jettison-1.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/asm-3.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/commons-cli-1.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/commons-lang-2.6.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/commons-io-2.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/guice-3.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/xz-1.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jsr305-3.0.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-registry-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-client-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-api-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-server-tests-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.7.5.jar:/home/wor
k/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-server-common-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-common-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/junit-4.11.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/asm
-3.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/hadoop-annotations-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.5-tests.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.7.5.jar:/contrib/capacity-scheduler/*.jar::
> >> > 2021-06-10 13:00:27,856 INFO
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] -
> >> > --------------------------------------------------------------------------------
> >> > 2021-06-10 13:00:27,857 INFO
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] -
> >> > Registered UNIX signal handlers for [TERM, HUP, INT]
> >> > 2021-06-10 13:00:27,874 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: taskmanager.numberOfTaskSlots, 20
> >> > 2021-06-10 13:00:27,874 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: cluster.evenly-spread-out-slots, true
> >> > 2021-06-10 13:00:27,874 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: parallelism.default, 1
> >> > 2021-06-10 13:00:27,874 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: jobmanager.memory.process.size, 3gb
> >> > 2021-06-10 13:00:27,875 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: jobmanager.memory.jvm-metaspace.size,
> >> > 512mb
> >> > 2021-06-10 13:00:27,875 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property:
> >> > jobmanager.memory.jvm-overhead.fraction, 0.1
> >> > 2021-06-10 13:00:27,875 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: jobmanager.memory.jvm-overhead.min,
> >> > 192mb
> >> > 2021-06-10 13:00:27,875 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: jobmanager.memory.jvm-overhead.max,
> >> > 512mb
> >> > 2021-06-10 13:00:27,875 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: jobmanager.memory.off-heap.size, 256mb
> >> > 2021-06-10 13:00:27,875 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: taskmanager.memory.process.size, 20gb
> >> > 2021-06-10 13:00:27,875 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: taskmanager.memory.jvm-metaspace.size,
> >> > 512mb
> >> > 2021-06-10 13:00:27,876 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property:
> >> > taskmanager.memory.jvm-overhead.fraction, 0.1
> >> > 2021-06-10 13:00:27,876 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: taskmanager.memory.jvm-overhead.min,
> >> > 192mb
> >> > 2021-06-10 13:00:27,876 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: taskmanager.memory.jvm-overhead.max,
> >> > 512mb
> >> > 2021-06-10 13:00:27,876 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: taskmanager.memory.segment-size, 32kb
> >> > 2021-06-10 13:00:27,876 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: taskmanager.memory.managed.fraction,
> >> > 0.4
> >> > 2021-06-10 13:00:27,876 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: taskmanager.memory.managed.size, 64mb
> >> > 2021-06-10 13:00:27,876 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: taskmanager.memory.network.fraction,
> >> > 0.1
> >> > 2021-06-10 13:00:27,877 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: taskmanager.memory.network.min, 1gb
> >> > 2021-06-10 13:00:27,877 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: taskmanager.memory.network.max, 2gb
> >> > 2021-06-10 13:00:27,877 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property:
> >> > taskmanager.memory.framework.off-heap.size, 256mb
> >> > 2021-06-10 13:00:27,877 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: taskmanager.memory.task.off-heap.size,
> >> > 512mb
> >> > 2021-06-10 13:00:27,877 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property:
> >> > taskmanager.memory.framework.heap.size, 256mb
> >> > 2021-06-10 13:00:27,877 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: high-availability, zookeeper
> >> > 2021-06-10 13:00:27,877 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: high-availability.storageDir,
> >> > bos://flink-bucket/flink/ha
> >> > 2021-06-10 13:00:27,878 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: high-availability.zookeeper.quorum,
> >> > bjhw-aisecurity-cassandra01.bjhw:9681,bjhw-aisecurity-cassandra02.bjhw:9681,bjhw-aisecurity-cassandra03.bjhw:9681,bjhw-aisecurity-cassandra04.bjhw:9681,bjhw-aisecurity-cassandra05.bjhw:9681
> >> > 2021-06-10 13:00:27,878 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: high-availability.zookeeper.path.root,
> >> > /flink
> >> > 2021-06-10 13:00:27,878 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: high-availability.cluster-id,
> >> > opera_upd_FlinkTestJob3
> >> > 2021-06-10 13:00:27,878 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: web.checkpoints.history, 100
> >> > 2021-06-10 13:00:27,878 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: state.checkpoints.num-retained, 100
> >> > 2021-06-10 13:00:27,878 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: state.checkpoints.dir,
> >> > bos://flink-bucket/flink/default-checkpoints
> >> > 2021-06-10 13:00:27,878 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: state.savepoints.dir,
> >> > bos://flink-bucket/flink/default-savepoints
> >> > 2021-06-10 13:00:27,878 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property:
> >> > jobmanager.execution.failover-strategy, region
> >> > 2021-06-10 13:00:27,879 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: web.submit.enable, false
> >> > 2021-06-10 13:00:27,879 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: jobmanager.archive.fs.dir,
> >> > bos://flink-bucket/flink/completed-jobs/opera_upd_FlinkTestJob3
> >> > 2021-06-10 13:00:27,879 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: historyserver.archive.fs.dir,
> >> > bos://flink-bucket/flink/completed-jobs/opera_upd_FlinkTestJob3
> >> > 2021-06-10 13:00:27,879 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property:
> >> > historyserver.archive.fs.refresh-interval, 10000
> >> > 2021-06-10 13:00:27,879 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: rest.port, 8600
> >> > 2021-06-10 13:00:27,879 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: historyserver.web.port, 8700
> >> > 2021-06-10 13:00:27,879 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: high-availability.jobmanager.port,
> >> > 9318
> >> > 2021-06-10 13:00:27,879 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: blob.server.port, 9320
> >> > 2021-06-10 13:00:27,880 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: taskmanager.rpc.port, 9319
> >> > 2021-06-10 13:00:27,880 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: taskmanager.data.port, 9325
> >> > 2021-06-10 13:00:27,880 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: metrics.internal.query-service.port,
> >> > 9321,9322
> >> > 2021-06-10 13:00:27,880 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: akka.ask.timeout, 60s
> >> > 2021-06-10 13:00:27,880 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property:
> >> > taskmanager.network.request-backoff.max, 60000
> >> > 2021-06-10 13:00:27,880 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: akka.framesize, 104857600b
> >> > 2021-06-10 13:00:27,880 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: env.java.home,
> >> > /home/work/antibotFlink/java8
> >> > 2021-06-10 13:00:27,880 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: env.pid.dir,
> >> > /home/work/antibotFlink/flink-1.13.1-sm
> >> > 2021-06-10 13:00:27,881 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: io.tmp.dirs,
> >> > /home/work/antibotFlink/flink-1.13.1-sm/tmp
> >> > 2021-06-10 13:00:27,881 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: web.tmpdir,
> >> > /home/work/antibotFlink/flink-1.13.1-sm/tmp
> >> > 2021-06-10 13:00:27,881 INFO
> >> > org.apache.flink.configuration.GlobalConfiguration           [] -
> >> > Loading configuration property: env.java.opts, "-XX:+UseG1GC
> >> > -XX:-OmitStackTraceInFastThrow"
> >> > 2021-06-10 13:00:27,907 INFO
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] -
> >> > Starting StandaloneSessionClusterEntrypoint.
> >> > 2021-06-10 13:00:27,953 INFO
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] -
> >> > Install default filesystem.
> >> > 2021-06-10 13:00:27,998 INFO
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] -
> >> > Install security context.
> >> > 2021-06-10 13:00:28,016 WARN
> >> > org.apache.flink.runtime.util.HadoopUtils                    [] -
> >> > Could not find Hadoop configuration via any of the supported methods
> >> > (Flink configuration, environment variables).
> >> > 2021-06-10 13:00:28,059 INFO
> >> > org.apache.flink.runtime.security.modules.HadoopModule       [] -
> >> > Hadoop user set to work (auth:SIMPLE)
> >> > 2021-06-10 13:00:28,067 INFO
> >> > org.apache.flink.runtime.security.modules.JaasModule         [] - Jaas
> >> > file will be created as
> >> > /home/work/antibotFlink/flink-1.13.1-sm/tmp/jaas-3056484369482264807.conf.
> >> > 2021-06-10 13:00:28,078 INFO
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint        [] -
> >> > Initializing cluster services.
> >> > 2021-06-10 13:00:28,108 INFO
> >> > org.apache.flink.runtime.rpc.akka.AkkaRpcServiceUtils        [] -
> >> > Trying to start actor system, external address
> >> > nj03-ecom-adapp-m12-39.nj03.baidu.com:9318, bind address 0.0.0.0:9318.
> >> > 2021-06-10 13:00:28,716 INFO  akka.event.slf4j.Slf4jLogger
> >> >                     [] - Slf4jLogger started
> >> > 2021-06-10 13:00:28,746 INFO  akka.remote.Remoting
> >> >                     [] - Starting remoting
> >> > 2021-06-10 13:00:28,887 INFO  akka.remote.Remoting
> >> >                     [] - Remoting started; listening on addresses
> >> > :[akka.tcp://fl...@nj03-ecom-adapp-m12-39.nj03.baidu.com:9318]
> >> > 2021-06-10 13:00:29,022 INFO
> >> > org.apache.flink.runtime.rpc.akka.AkkaRpcServiceUtils        [] -
> >> > Actor system started at
> >> > akka.tcp://fl...@nj03-ecom-adapp-m12-39.nj03.baidu.com:9318
> >> > 2021-06-10 13:00:29,047 WARN
> >> > org.apache.flink.runtime.util.HadoopUtils                    [] -
> >> > Could not find Hadoop configuration via any of the supported methods
> >> > (Flink configuration, environment variables).
> >> > 2021-06-10 13:00:29,728 INFO
> >> > org.apache.flink.runtime.blob.FileSystemBlobStore            [] -
> >> > Creating highly available BLOB storage directory at
> >> > bos://flink-bucket/flink/ha/opera_upd_FlinkTestJob3/blob
> >> > 2021-06-10 13:00:30,023 INFO  com.baidubce.http.BceHttpClient
> >> >                     [] - Unable to execute HTTP request
> >> > com.baidubce.BceServiceException: Not Found (Status Code: 404; Error
> >> > Code: null; Request ID: d4f759eb-84d6-4c94-b22a-a5d9b7a63613)
> >> > at 
> >> > com.baidubce.http.handler.BceErrorResponseHandler.handle(BceErrorResponseHandler.java:59)
> >> > ~[bce-java-sdk-0.10.82.jar:?]
> >> > at com.baidubce.http.BceHttpClient.execute(BceHttpClient.java:243)
> >> > ~[bce-java-sdk-0.10.82.jar:?]
> >> > at 
> >> > com.baidubce.AbstractBceClient.invokeHttpClient(AbstractBceClient.java:189)
> >> > ~[bce-java-sdk-0.10.82.jar:?]
> >> > at 
> >> > com.baidubce.services.bos.BosClient.getObjectMetadata(BosClient.java:1189)
> >> > ~[bce-java-sdk-0.10.82.jar:?]
> >> > at 
> >> > com.baidubce.services.bos.BosClient.getObjectMetadata(BosClient.java:1171)
> >> > ~[bce-java-sdk-0.10.82.jar:?]
> >> > at 
> >> > org.apache.hadoop.fs.bos.BosNativeFileSystemStore.retrieveMetadata(BosNativeFileSystemStore.java:531)
> >> > ~[bos-hdfs-sdk-1.0.1-SNAPSHOT-0.jar:?]
> >> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
> >> > ~[?:1.8.0_251]
> >> > at 
> >> > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >> > ~[?:1.8.0_251]
> >> > at 
> >> > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >> > ~[?:1.8.0_251]
> >> > at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_251]
> >> > at 
> >> > org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
> >> > ~[hadoop-common-2.7.5.jar:?]
> >> > at 
> >> > org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> >> > ~[hadoop-common-2.7.5.jar:?]
> >> > at org.apache.hadoop.fs.bos.$Proxy22.retrieveMetadata(Unknown Source) 
> >> > ~[?:?]
> >> > at 
> >> > org.apache.hadoop.fs.bos.BaiduBosFileSystem.getFileStatus(BaiduBosFileSystem.java:242)
> >> > ~[bos-hdfs-sdk-1.0.1-SNAPSHOT-0.jar:?]
> >> > at 
> >> > org.apache.hadoop.fs.bos.BaiduBosFileSystem.mkdir(BaiduBosFileSystem.java:393)
> >> > ~[bos-hdfs-sdk-1.0.1-SNAPSHOT-0.jar:?]
> >> > at 
> >> > org.apache.hadoop.fs.bos.BaiduBosFileSystem.mkdirs(BaiduBosFileSystem.java:386)
> >> > ~[bos-hdfs-sdk-1.0.1-SNAPSHOT-0.jar:?]
> >> > at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1880)
> >> > ~[hadoop-common-2.7.5.jar:?]
> >> > at 
> >> > org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.mkdirs(HadoopFileSystem.java:183)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.blob.FileSystemBlobStore.<init>(FileSystemBlobStore.java:64)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.blob.BlobUtils.createFileSystemBlobStore(BlobUtils.java:98)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.blob.BlobUtils.createBlobStoreFromConfig(BlobUtils.java:76)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.highavailability.HighAvailabilityServicesUtils.createHighAvailabilityServices(HighAvailabilityServicesUtils.java:115)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.createHaServices(ClusterEntrypoint.java:353)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.initializeServices(ClusterEntrypoint.java:311)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:239)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$1(ClusterEntrypoint.java:189)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at java.security.AccessController.doPrivileged(Native Method) 
> >> > ~[?:1.8.0_251]
> >> > at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_251]
> >> > at 
> >> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
> >> > [hadoop-common-2.7.5.jar:?]
> >> > at 
> >> > org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
> >> > [flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:186)
> >> > [flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runClusterEntrypoint(ClusterEntrypoint.java:600)
> >> > [flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.StandaloneSessionClusterEntrypoint.main(StandaloneSessionClusterEntrypoint.java:59)
> >> > [flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > 2021-06-10 13:00:30,324 INFO  com.baidubce.http.BceHttpClient
> >> >                     [] - Unable to execute HTTP request
> >> > com.baidubce.BceServiceException: Not Found (Status Code: 404; Error
> >> > Code: null; Request ID: 338e60a4-079e-4d4f-ab79-ba06727218e2)
> >> > at 
> >> > com.baidubce.http.handler.BceErrorResponseHandler.handle(BceErrorResponseHandler.java:59)
> >> > ~[bce-java-sdk-0.10.82.jar:?]
> >> > at com.baidubce.http.BceHttpClient.execute(BceHttpClient.java:243)
> >> > ~[bce-java-sdk-0.10.82.jar:?]
> >> > at 
> >> > com.baidubce.AbstractBceClient.invokeHttpClient(AbstractBceClient.java:189)
> >> > ~[bce-java-sdk-0.10.82.jar:?]
> >> > at 
> >> > com.baidubce.services.bos.BosClient.getObjectMetadata(BosClient.java:1189)
> >> > ~[bce-java-sdk-0.10.82.jar:?]
> >> > at 
> >> > com.baidubce.services.bos.BosClient.getObjectMetadata(BosClient.java:1171)
> >> > ~[bce-java-sdk-0.10.82.jar:?]
> >> > at 
> >> > org.apache.hadoop.fs.bos.BosNativeFileSystemStore.retrieveMetadata(BosNativeFileSystemStore.java:531)
> >> > ~[bos-hdfs-sdk-1.0.1-SNAPSHOT-0.jar:?]
> >> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
> >> > ~[?:1.8.0_251]
> >> > at 
> >> > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >> > ~[?:1.8.0_251]
> >> > at 
> >> > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >> > ~[?:1.8.0_251]
> >> > at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_251]
> >> > at 
> >> > org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
> >> > ~[hadoop-common-2.7.5.jar:?]
> >> > at 
> >> > org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> >> > ~[hadoop-common-2.7.5.jar:?]
> >> > at org.apache.hadoop.fs.bos.$Proxy22.retrieveMetadata(Unknown Source) 
> >> > ~[?:?]
> >> > at 
> >> > org.apache.hadoop.fs.bos.BaiduBosFileSystem.getFileStatus(BaiduBosFileSystem.java:242)
> >> > ~[bos-hdfs-sdk-1.0.1-SNAPSHOT-0.jar:?]
> >> > at 
> >> > org.apache.hadoop.fs.bos.BaiduBosFileSystem.mkdir(BaiduBosFileSystem.java:393)
> >> > ~[bos-hdfs-sdk-1.0.1-SNAPSHOT-0.jar:?]
> >> > at 
> >> > org.apache.hadoop.fs.bos.BaiduBosFileSystem.mkdirs(BaiduBosFileSystem.java:386)
> >> > ~[bos-hdfs-sdk-1.0.1-SNAPSHOT-0.jar:?]
> >> > at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1880)
> >> > ~[hadoop-common-2.7.5.jar:?]
> >> > at 
> >> > org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.mkdirs(HadoopFileSystem.java:183)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.blob.FileSystemBlobStore.<init>(FileSystemBlobStore.java:64)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.blob.BlobUtils.createFileSystemBlobStore(BlobUtils.java:98)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.blob.BlobUtils.createBlobStoreFromConfig(BlobUtils.java:76)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.highavailability.HighAvailabilityServicesUtils.createHighAvailabilityServices(HighAvailabilityServicesUtils.java:115)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.createHaServices(ClusterEntrypoint.java:353)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.initializeServices(ClusterEntrypoint.java:311)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:239)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$1(ClusterEntrypoint.java:189)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at java.security.AccessController.doPrivileged(Native Method) 
> >> > ~[?:1.8.0_251]
> >> > at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_251]
> >> > at 
> >> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
> >> > [hadoop-common-2.7.5.jar:?]
> >> > at 
> >> > org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
> >> > [flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:186)
> >> > [flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runClusterEntrypoint(ClusterEntrypoint.java:600)
> >> > [flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.StandaloneSessionClusterEntrypoint.main(StandaloneSessionClusterEntrypoint.java:59)
> >> > [flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > 2021-06-10 13:00:30,371 INFO  com.baidubce.http.BceHttpClient
> >> >                     [] - Unable to execute HTTP request
> >> > com.baidubce.BceServiceException: Not Found (Status Code: 404; Error
> >> > Code: null; Request ID: 9254cc39-e6f8-4cf2-a684-c102e61289cf)
> >> > at 
> >> > com.baidubce.http.handler.BceErrorResponseHandler.handle(BceErrorResponseHandler.java:59)
> >> > ~[bce-java-sdk-0.10.82.jar:?]
> >> > at com.baidubce.http.BceHttpClient.execute(BceHttpClient.java:243)
> >> > ~[bce-java-sdk-0.10.82.jar:?]
> >> > at 
> >> > com.baidubce.AbstractBceClient.invokeHttpClient(AbstractBceClient.java:189)
> >> > ~[bce-java-sdk-0.10.82.jar:?]
> >> > at 
> >> > com.baidubce.services.bos.BosClient.getObjectMetadata(BosClient.java:1189)
> >> > ~[bce-java-sdk-0.10.82.jar:?]
> >> > at 
> >> > com.baidubce.services.bos.BosClient.getObjectMetadata(BosClient.java:1171)
> >> > ~[bce-java-sdk-0.10.82.jar:?]
> >> > at 
> >> > org.apache.hadoop.fs.bos.BosNativeFileSystemStore.retrieveMetadata(BosNativeFileSystemStore.java:531)
> >> > ~[bos-hdfs-sdk-1.0.1-SNAPSHOT-0.jar:?]
> >> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
> >> > ~[?:1.8.0_251]
> >> > at 
> >> > sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >> > ~[?:1.8.0_251]
> >> > at 
> >> > sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >> > ~[?:1.8.0_251]
> >> > at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_251]
> >> > at 
> >> > org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
> >> > ~[hadoop-common-2.7.5.jar:?]
> >> > at 
> >> > org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> >> > ~[hadoop-common-2.7.5.jar:?]
> >> > at org.apache.hadoop.fs.bos.$Proxy22.retrieveMetadata(Unknown Source) 
> >> > ~[?:?]
> >> > at 
> >> > org.apache.hadoop.fs.bos.BaiduBosFileSystem.getFileStatus(BaiduBosFileSystem.java:252)
> >> > ~[bos-hdfs-sdk-1.0.1-SNAPSHOT-0.jar:?]
> >> > at 
> >> > org.apache.hadoop.fs.bos.BaiduBosFileSystem.mkdir(BaiduBosFileSystem.java:393)
> >> > ~[bos-hdfs-sdk-1.0.1-SNAPSHOT-0.jar:?]
> >> > at 
> >> > org.apache.hadoop.fs.bos.BaiduBosFileSystem.mkdirs(BaiduBosFileSystem.java:386)
> >> > ~[bos-hdfs-sdk-1.0.1-SNAPSHOT-0.jar:?]
> >> > at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1880)
> >> > ~[hadoop-common-2.7.5.jar:?]
> >> > at 
> >> > org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.mkdirs(HadoopFileSystem.java:183)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.blob.FileSystemBlobStore.<init>(FileSystemBlobStore.java:64)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.blob.BlobUtils.createFileSystemBlobStore(BlobUtils.java:98)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.blob.BlobUtils.createBlobStoreFromConfig(BlobUtils.java:76)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.highavailability.HighAvailabilityServicesUtils.createHighAvailabilityServices(HighAvailabilityServicesUtils.java:115)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.createHaServices(ClusterEntrypoint.java:353)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.initializeServices(ClusterEntrypoint.java:311)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:239)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$1(ClusterEntrypoint.java:189)
> >> > ~[flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at java.security.AccessController.doPrivileged(Native Method) 
> >> > ~[?:1.8.0_251]
> >> > at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_251]
> >> > at 
> >> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
> >> > [hadoop-common-2.7.5.jar:?]
> >> > at 
> >> > org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
> >> > [flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:186)
> >> > [flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runClusterEntrypoint(ClusterEntrypoint.java:600)
> >> > [flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > at 
> >> > org.apache.flink.runtime.entrypoint.StandaloneSessionClusterEntrypoint.main(StandaloneSessionClusterEntrypoint.java:59)
> >> > [flink-dist_2.11-1.13.1.jar:1.13.1]
> >> > 2021-06-10 13:00:31,804 INFO
> >> > org.apache.flink.runtime.util.ZooKeeperUtils                 [] -
> >> > Enforcing default ACL for ZK connections
> >> > 2021-06-10 13:00:31,804 INFO
> >> > org.apache.flink.runtime.util.ZooKeeperUtils                 [] -
> >> > Using '/flink/opera_upd_FlinkTestJob3' as Zookeeper namespace.
> >> > 2021-06-10 13:00:31,853 INFO
> >> > org.apache.flink.shaded.curator4.org.apache.curator.utils.Compatibility
> >> > [] - Running in ZooKeeper 3.4.x compatibility mode
> >> > 2021-06-10 13:00:31,854 INFO
> >> > org.apache.flink.shaded.curator4.org.apache.curator.utils.Compatibility
> >> > [] - Using emulated InjectSessionExpiration
> >> > 2021-06-10 13:00:31,890 INFO
> >> > org.apache.flink.shaded.curator4.org.apache.curator.framework.imps.CuratorFrameworkImpl
> >> > [] - Starting
> >> > 2021-06-10 13:00:31,899 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ZooKeeper [] -
> >> > Client 
> >> > environment:zookeeper.version=3.4.14-4c25d480e66aadd371de8bd2fd8da255ac140bcf,
> >> > built on 03/06/2019 16:18 GMT
> >> > 2021-06-10 13:00:31,899 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ZooKeeper [] -
> >> > Client environment:host.name=nj03-ecom-adapp-m12-39.nj03.baidu.com
> >> > 2021-06-10 13:00:31,899 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ZooKeeper [] -
> >> > Client environment:java.version=1.8.0_251
> >> > 2021-06-10 13:00:31,899 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ZooKeeper [] -
> >> > Client environment:java.vendor=Oracle Corporation
> >> > 2021-06-10 13:00:31,899 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ZooKeeper [] -
> >> > Client environment:java.home=/home/work/antibotFlink/java8/jre
> >> > 2021-06-10 13:00:31,899 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ZooKeeper [] -
> >> > Client 
> >> > environment:java.class.path=/home/work/antibotFlink/flink-1.13.1-sm/lib/flink-csv-1.13.1.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/flink-json-1.13.1.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/flink-shaded-zookeeper-3.4.14.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/flink-table_2.11-1.13.1.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/flink-table-blink_2.11-1.13.1.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/log4j-1.2-api-2.12.1.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/log4j-api-2.12.1.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/log4j-core-2.12.1.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/log4j-slf4j-impl-2.12.1.jar:/home/work/antibotFlink/flink-1.13.1-sm/lib/flink-dist_2.11-1.13.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/etc/hadoop:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/avro-1.7.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/activation-1.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-codec-1.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jetty-sslengine-6.1.26.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jetty-6.1.26.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/curator-client-2.7.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-collections-3.2.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jets3t-0.9.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/joda-time-2.10.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jersey-core-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/junit-4.11.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/servlet-api-2.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/stax-api-1.0-2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jackson-core-2.10.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/hamcrest-core-1.3.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/guava-11.0.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-logging-1.1.3.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/httpcore-4.4.10.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/httpclient-4.5.6.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jackson-databind-2.10.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jersey-json-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/curator-recipes-2.7.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jettison-1.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/paranamer-2.3.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/asm-3.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jersey-server-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/xmlenc-0.52.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-net-3.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-cli-1.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jsp-api-2.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/zookeeper-3.4.6.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-lang-2.6.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/htrace-core-3.1.0-incubating.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-io-2.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-digester-1.8.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/gson-2.2.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/commons-math3-3.1.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jackson-annotations-2.10.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/log4j-1.2.17.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/bce-java-sdk-0.10.82.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/xz-1.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/hadoop-auth-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jsch-0.1.54.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/curator-framework-2.7.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/jsr305-3.0.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/lib/hadoop-annotations-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/hadoop-common-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/hadoop-nfs-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/common/hadoop-common-2.7.5-tests.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/netty-all-4.0.23.Final.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/asm-3.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/htrace-core-3.1.0-incubating.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/commons-io-2.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/lib/jsr305-3.0.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/hadoop-hdfs-2.7.5-tests.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/hadoop-hdfs-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/hadoop-hdfs-nfs-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/libdfs-java-2.0.5-support-community.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/hdfs/bos-hdfs-sdk-1.0.1-SNAPSHOT-0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/activation-1.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/commons-codec-1.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jetty-6.1.26.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/commons-collections-3.2.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/zookeeper-3.4.6-tests.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/servlet-api-2.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/guava-11.0.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jersey-client-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/javax.inject-1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jersey-json-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jettison-1.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/asm-3.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/commons-cli-1.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/commons-lang-2.6.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/commons-io-2.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/guice-3.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/xz-1.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/lib/jsr305-3.0.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-registry-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-client-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-api-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-server-tests-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-server-common-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-common-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/junit-4.11.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/lib/hadoop-annotations-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.5-tests.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.7.5.jar:/home/work/antibotFlink/hadoop-client-2.7.5/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.7.5.jar:/contrib/capacity-scheduler/*.jar::
> >> > 2021-06-10 13:00:31,899 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ZooKeeper [] -
> >> > Client 
> >> > environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
> >> > 2021-06-10 13:00:31,899 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ZooKeeper [] -
> >> > Client environment:java.io.tmpdir=/tmp
> >> > 2021-06-10 13:00:31,900 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ZooKeeper [] -
> >> > Client environment:java.compiler=<NA>
> >> > 2021-06-10 13:00:31,900 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ZooKeeper [] -
> >> > Client environment:os.name=Linux
> >> > 2021-06-10 13:00:31,900 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ZooKeeper [] -
> >> > Client environment:os.arch=amd64
> >> > 2021-06-10 13:00:31,900 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ZooKeeper [] -
> >> > Client environment:os.version=3.10.0_3-0-0-20
> >> > 2021-06-10 13:00:31,900 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ZooKeeper [] -
> >> > Client environment:user.name=work
> >> > 2021-06-10 13:00:31,900 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ZooKeeper [] -
> >> > Client environment:user.home=/home/work
> >> > 2021-06-10 13:00:31,900 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ZooKeeper [] -
> >> > Client environment:user.dir=/home/work/antibotFlink/flink-1.13.1-sm
> >> > 2021-06-10 13:00:31,901 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ZooKeeper [] -
> >> > Initiating client connection,
> >> > connectString=bjhw-aisecurity-cassandra01.bjhw:9681,bjhw-aisecurity-cassandra02.bjhw:9681,bjhw-aisecurity-cassandra03.bjhw:9681,bjhw-aisecurity-cassandra04.bjhw:9681,bjhw-aisecurity-cassandra05.bjhw:9681
> >> > sessionTimeout=60000
> >> > watcher=org.apache.flink.shaded.curator4.org.apache.curator.ConnectionState@5807efad
> >> > 2021-06-10 13:00:31,918 INFO
> >> > org.apache.flink.shaded.curator4.org.apache.curator.framework.imps.CuratorFrameworkImpl
> >> > [] - Default schema
> >> > 2021-06-10 13:00:31,919 WARN
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ClientCnxn []
> >> > - SASL configuration failed: javax.security.auth.login.LoginException:
> >> > No JAAS configuration section named 'Client' was found in specified
> >> > JAAS configuration file:
> >> > '/home/work/antibotFlink/flink-1.13.1-sm/tmp/jaas-3056484369482264807.conf'.
> >> > Will continue connection to Zookeeper server without SASL
> >> > authentication, if Zookeeper server allows it.
> >> > 2021-06-10 13:00:31,920 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ClientCnxn []
> >> > - Opening socket connection to server
> >> > bjhw-aisecurity-cassandra02.bjhw/10.227.136.169:9681
> >> > 2021-06-10 13:00:31,921 ERROR
> >> > org.apache.flink.shaded.curator4.org.apache.curator.ConnectionState []
> >> > - Authentication failed
> >> > 2021-06-10 13:00:31,924 INFO  org.apache.flink.runtime.blob.BlobServer
> >> >                     [] - Created BLOB server storage directory
> >> > /home/work/antibotFlink/flink-1.13.1-sm/tmp/blobStore-3e38f615-5992-4d98-8f35-f526f9251a6c
> >> > 2021-06-10 13:00:31,928 INFO  org.apache.flink.runtime.blob.BlobServer
> >> >                     [] - Started BLOB server at 0.0.0.0:9320 - max
> >> > concurrent requests: 50 - max backlog: 1000
> >> > 2021-06-10 13:00:31,941 INFO
> >> > org.apache.flink.runtime.metrics.MetricRegistryImpl          [] - No
> >> > metrics reporter configured, no metrics will be exposed/reported.
> >> > 2021-06-10 13:00:31,946 INFO
> >> > org.apache.flink.runtime.rpc.akka.AkkaRpcServiceUtils        [] -
> >> > Trying to start actor system, external address
> >> > nj03-ecom-adapp-m12-39.nj03.baidu.com:9321, bind address 0.0.0.0:9321.
> >> > 2021-06-10 13:00:31,947 INFO
> >> > org.apache.flink.shaded.zookeeper3.org.apache.zookeeper.ClientCnxn []
> >> > - Socket connection established to
> >> > bjhw-aisecurity-cassandra02.bjhw/10.227.136.169:9681, initiating
> >> > session
> >> > 2021-06-10 13:00:31,964 INFO  akka.event.slf4j.Slf4jLogge