Hello Yang,
I have attached my pom file, and I did not see that I am using any Hadoop
dependency. Can you please help me?
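In case the jar does turn out to pull in Hadoop transitively, the offending dependency can be excluded in the pom along these lines (a sketch; the groupId/artifactId of the dependency carrying Hadoop is hypothetical and stands in for whatever `mvn dependency:tree -Dincludes=org.apache.hadoop` reports):

```xml
<dependency>
    <groupId>com.example</groupId>            <!-- hypothetical dependency -->
    <artifactId>some-connector</artifactId>   <!-- that drags Hadoop in -->
    <version>1.0</version>
    <exclusions>
        <exclusion>
            <!-- drop all transitive org.apache.hadoop artifacts -->
            <groupId>org.apache.hadoop</groupId>
            <artifactId>*</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```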
On Wed, May 6, 2020 at 1:22 PM Yang Wang wrote:
> Hi aj,
>
> From the logs you have provided, the hadoop version is still 2.4.1.
> Could you check whether the user jar (i.e. events-processor-1.0-SNAPSHOT.jar)
> contains some hadoop classes? If it does, you need to exclude the hadoop
> dependency.
>
>
> Best,
> Yang
>
> aj wrote on Wed, May 6, 2020 at 3:38 PM:
>
>> Hello,
>>
>> Please help me upgrade to 1.10 in AWS EMR.
>>
>> On Fri, May 1, 2020 at 4:05 PM aj wrote:
>>
>>> Hi Yang,
>>>
>>> I am attaching the logs for your reference. Please help me figure out what
>>> I am doing wrong.
>>>
>>> Thanks,
>>> Anuj
>>>
>>> On Wed, Apr 29, 2020 at 9:06 AM Yang Wang wrote:
>>>
Hi Anuj,
I think the exception you came across is still because the hadoop version
is 2.4.1. I have checked the hadoop code, and the code lines are exactly the
same.
For 2.8.1, I have also checked the ruleParser. It works.
/**
* A pattern for parsing a auth_to_local rule.
*/
private static final Pattern ruleParser =
Pattern.compile("\\s*((DEFAULT)|(RULE:\\[(\\d*):([^\\]]*)](\\(([^)]*)\\))?"+
"(s/([^/]*)/([^/]*)/(g)?)?))/?(L)?");
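The difference can be checked directly against the rule from the stack trace. Below is a quick sketch: NEW is the 2.8.x pattern quoted above; OLD is an assumption, the same pattern minus its trailing /?(L)? lowercase-suffix group, which is consistent with the "Invalid rule: /L" failure on 2.4.1:

```java
import java.util.regex.Pattern;

public class RuleParserCheck {
    // ruleParser as quoted above from Hadoop 2.8.x KerberosName.
    static final Pattern NEW = Pattern.compile(
            "\\s*((DEFAULT)|(RULE:\\[(\\d*):([^\\]]*)](\\(([^)]*)\\))?" +
            "(s/([^/]*)/([^/]*)/(g)?)?))/?(L)?");

    // Hypothetical older variant without the trailing /?(L)? group --
    // an assumption matching the 2.4.1 "Invalid rule: /L" symptom.
    static final Pattern OLD = Pattern.compile(
            "\\s*((DEFAULT)|(RULE:\\[(\\d*):([^\\]]*)](\\(([^)]*)\\))?" +
            "(s/([^/]*)/([^/]*)/(g)?)?))");

    public static void main(String[] args) {
        // The auth_to_local rule that EMR generates (from the stack trace).
        String rule = "RULE:[2:$1@$0](.*@)s/@.*///L";
        System.out.println(NEW.matcher(rule).matches()); // true
        System.out.println(OLD.matcher(rule).matches()); // false: "/L" is left over
    }
}
```

With the older pattern, Matcher.matches() fails because "/L" remains unconsumed after "s/@.*//", which is exactly where KerberosName.parseRules throws IllegalArgumentException.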
Could you share the jobmanager logs so that I can check the classpath
and hadoop version?
Best,
Yang
aj wrote on Tue, Apr 28, 2020 at 1:01 AM:
> Hello Yang,
> My Hadoop version is Hadoop 3.2.1-amzn-0,
> and I have put flink-shaded-hadoop-2-uber-2.8.3-10.0.jar in flink/lib.
>
> Then I get the error below:
>
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in
> [jar:file:/mnt/yarn/usercache/hadoop/appcache/application_1587983834922_0002/filecache/10/slf4j-log4j12-1.7.15.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in
> [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> Exception in thread "main" java.lang.IllegalArgumentException: Invalid
> rule: /L
> RULE:[2:$1@$0](.*@)s/@.*///L
> DEFAULT
> at
> org.apache.hadoop.security.authentication.util.KerberosName.parseRules(KerberosName.java:321)
> at
> org.apache.hadoop.security.authentication.util.KerberosName.setRules(KerberosName.java:386)
> at
> org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:75)
> at
> org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:247)
> at
> org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:232)
> at
> org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:718)
> at
> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:703)
> at
> org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:605)
> at
> org.apache.flink.yarn.entrypoint.YarnEntrypointUtils.logYarnEnvironmentInformation(YarnEntrypointUtils.java:136)
> at
> org.apache.flink.yarn.entrypoint.YarnJobClusterEntrypoint.main(YarnJobClusterEntrypoint.java:109)
>
>
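Since the cluster ships Hadoop 3.2.x, an alternative to the shaded 2.x uber jar is letting Flink 1.10 pick up the cluster's own Hadoop via HADOOP_CLASSPATH, per Flink's Hadoop integration docs (a deployment sketch; run on the EMR master, and the jar name is the one from this thread):

```shell
# Remove flink-shaded-hadoop-2-uber-*.jar from flink/lib first, then
# export the cluster's Hadoop classpath before submitting the job.
export HADOOP_CLASSPATH=$(hadoop classpath)
flink run -m yarn-cluster events-processor-1.0-SNAPSHOT.jar
```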
> If I remove the flink-shaded-hadoop-2-uber-2.8.3-10.0.jar from lib,
> then I get the error below:
>
> 2020-04-27 16:59:37,293 INFO org.apache.flink.client.cli.CliFrontend
> - Classpath:
> /usr/lib/flink/lib/flink-table-blink_2.11-1.10.0.jar:/usr/lib/flink/lib/flink-table_2.11-1.10.0.jar:/usr/lib/flink/lib/log4j-1.2.17.jar:/usr/lib/flink/lib/slf4j-log4j12-1.7.15.jar:/usr/lib/flink/lib/flink-dist_2.11-1.10.0.jar::/etc/hadoop/conf:/etc/hadoop/conf
> 2020-04-27 16:59:37,293 INFO org.apache.flink.client.cli.CliFrontend
> -
>
> 2020-04-27 16:59:37,300 INFO
> org.apache.flink.configuration.GlobalConfiguration - Loading
> configuration property: jobmanager.heap.size, 1024m
> 2020-04-27 16:59:37,300 INFO
> org.apache.flink.configuration.GlobalConfiguration - Loading
> configuration property: taskmanager.memory.process.size, 1568m
> 2020-04-27 16:59:37,300 INFO
> org.apache.flink.configuration.GlobalConfiguration - Loading
> configuration property: taskmanager.numberOfTaskSlots, 1
> 2020-04-27 16:59:37,300 INFO
> org.apache.flink.configuration.GlobalConfiguration - Loading
> configuration property: parallelism.default, 1
> 2020-04-27 16:59:37,300 INFO
>