YuAngZhang created FLINK-24249:
----------------------------------

             Summary: Login from keytab fails when the local disk is damaged
                 Key: FLINK-24249
                 URL: https://issues.apache.org/jira/browse/FLINK-24249
             Project: Flink
          Issue Type: Bug
          Components: Runtime / Checkpointing
    Affects Versions: 1.13.2
            Reporter: YuAngZhang


Flink on YARN localizes the user keytab onto the local disk of the machine running the container. When that disk is damaged, triggering a checkpoint fails because the JobManager can no longer log in from the keytab while calling mkdirs on HDFS. However, the Flink job itself does not fail, so I cannot recover from a checkpoint.

The exception looks like this:
{code:java}
java.io.IOException: Failed on local exception: java.io.IOException: Login failure for joey from keytab /data01/yarn/nm/usercache/joey/appcache/application_1631093653028_0015/container_e134_1631093653028_0015_01_000001/krb5.keytab; Host Details : local host is: "localhost/10.1.1.37"; destination host is: "localhost":8020;
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:772) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.ipc.Client.call(Client.java:1474) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at com.sun.proxy.$Proxy41.mkdirs(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:539) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at sun.reflect.GeneratedMethodAccessor63.invoke(Unknown Source) ~[?:?]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at com.sun.proxy.$Proxy42.mkdirs(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2742) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2713) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:870) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:866) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:866) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:859) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1819) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.mkdirs(HadoopFileSystem.java:183) ~[flink-dist_2.11-1.13.2.jar:1.13.2]
    at org.apache.flink.runtime.state.filesystem.FsCheckpointStorageAccess.initializeLocationForCheckpoint(FsCheckpointStorageAccess.java:129) ~[flink-dist_2.11-1.13.2.jar:1.13.2]
    at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.initializeCheckpoint(CheckpointCoordinator.java:689) ~[flink-dist_2.11-1.13.2.jar:1.13.2]
    at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.lambda$startTriggeringCheckpoint$2(CheckpointCoordinator.java:543) ~[flink-dist_2.11-1.13.2.jar:1.13.2]
    at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:602) [?:1.8.0_181]
    at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:577) [?:1.8.0_181]
    at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:442) [?:1.8.0_181]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_181]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_181]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_181]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_181]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181]
    at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
Caused by: java.io.IOException: Login failure for joey from keytab /data01/yarn/nm/usercache/joey/appcache/application_1631093653028_0015/container_e134_1631093653028_0015_01_000001/krb5.keytab
    at org.apache.hadoop.security.UserGroupInformation.reloginFromKeytab(UserGroupInformation.java:1086) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:659) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_181]
    at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_181]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:645) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:732) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:370) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1523) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.ipc.Client.call(Client.java:1440) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    ... 32 more
Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user
    at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:897) ~[?:1.8.0_181]
    at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:760) ~[?:1.8.0_181]
    at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617) ~[?:1.8.0_181]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
    at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755) ~[?:1.8.0_181]
    at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195) ~[?:1.8.0_181]
    at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682) ~[?:1.8.0_181]
    at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680) ~[?:1.8.0_181]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_181]
    at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680) ~[?:1.8.0_181]
    at javax.security.auth.login.LoginContext.login(LoginContext.java:587) ~[?:1.8.0_181]
    at org.apache.hadoop.security.UserGroupInformation.reloginFromKeytab(UserGroupInformation.java:1078) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:659) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_181]
    at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_181]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:645) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:732) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:370) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1523) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    at org.apache.hadoop.ipc.Client.call(Client.java:1440) ~[flink-shaded-hadoop-2-uber-2.6.5-10.0.jar:2.6.5-10.0]
    ... 32 more
{code}
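
For context, the setting that normally bounds consecutive checkpoint failures is the tolerable-failed-checkpoints limit. Below is a minimal sketch (my own illustration, not part of this report) of how it is configured with the DataStream API on 1.13; the interval and threshold values are arbitrary. The behaviour described above suggests that failures thrown while initializing the checkpoint location during triggering do not trip this limit, so the job keeps running even though every checkpoint fails.
{code:java}
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class CheckpointFailureConfigSketch {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Trigger a checkpoint every 60 seconds (arbitrary interval for illustration).
        env.enableCheckpointing(60_000);

        // Fail the job as soon as one checkpoint fails (0 tolerated failures), so it can be
        // rescheduled on a healthy node and restored from the last successful checkpoint.
        // Per this ticket, failures during checkpoint triggering (e.g. the keytab relogin
        // inside initializeLocationForCheckpoint) appear not to be counted here.
        env.getCheckpointConfig().setTolerableCheckpointFailureNumber(0);
    }
}
{code}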



