Hi Ragini,
this is a dependency version issue: Flink 1.8.x does not support Hadoop 3
yet. Support for Apache Hadoop 3.x was only added in Flink 1.11 [1] through
FLINK-11086 [2]. You would need to upgrade to a more recent Flink version.
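
In case it helps, this is roughly what the setup would look like after the
upgrade (just a sketch; the Flink version, install path, and job jar below
are placeholders, not taken from your setup):

  # keep pointing Flink at the cluster's Hadoop 3 configuration and classes
  export HADOOP_CONF_DIR=/etc/hadoop/conf
  export HADOOP_CLASSPATH=`hadoop classpath`
  # submit against YARN with a recent, Hadoop-free Flink distribution
  ./flink-1.11.3/bin/flink run -m yarn-cluster ./examples/streaming/WordCount.jar

Since Flink 1.11 the distribution no longer bundles Hadoop, so the
HADOOP_CLASSPATH approach you are already using is exactly the supported
way to pick up the cluster's Hadoop 3 classes.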

Best,
Matthias

[1]
https://flink.apache.org/news/2020/07/06/release-1.11.0.html#important-changes
[2] https://issues.apache.org/jira/browse/FLINK-11086

On Mon, May 3, 2021 at 3:05 PM Ragini Manjaiah <ragini.manja...@gmail.com>
wrote:

> Hi Team,
> I am running Flink 1.8.1 against open-source Hadoop 3.2.0. My Flink jobs
> run without issues on HDP 2.5.3, but when they run on open-source Hadoop
> 3.2.0 I encounter the exception mentioned below.
> I have set the Hadoop environment variables as follows:
> export HADOOP_CONF_DIR=/etc/hadoop/conf
> export HADOOP_CLASSPATH=`hadoop classpath`
>
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/home_dir/svsap61/flink-1.8.1/lib/slf4j-log4j12-1.7.15.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/share/hadoop-tgt-3.2.0.1.0.0.11/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
>
> java.lang.IllegalAccessError: tried to access method org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider.getProxyInternal()Ljava/lang/Object; from class org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider
>     at org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider.init(RequestHedgingRMFailoverProxyProvider.java:75)
>     at org.apache.hadoop.yarn.client.RMProxy.createRMFailoverProxyProvider(RMProxy.java:188)
>     at org.apache.hadoop.yarn.client.RMProxy.createRMProxy(RMProxy.java:118)
>     at org.apache.hadoop.yarn.client.RMProxy.createRMProxy(RMProxy.java:93)
>     at org.apache.hadoop.yarn.client.ClientRMProxy.createRMProxy(ClientRMProxy.java:72)
>     at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceStart(YarnClientImpl.java:195)
>     at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
>     at org.apache.flink.yarn.cli.FlinkYarnSessionCli.getClusterDescriptor(FlinkYarnSessionCli.java:1013)
>     at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createDescriptor(FlinkYarnSessionCli.java:274)
>     at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:454)
>     at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:97)
>     at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:224)
>     at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:213)
>     at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1050)
>     at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1126)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>     at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>     at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1126)
>
