[ https://issues.apache.org/jira/browse/OOZIE-2871?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16210735#comment-16210735 ]

Denes Bodo commented on OOZIE-2871:
-----------------------------------

[~asasvari] A very similar issue is reproducible between two secure clusters in 
the same realm. In that case, running the hadoop distcp command from the command 
line works as expected, but when we ask Oozie to run it, it fails with the above 
error (and stack trace). The Oozie version is 4.2.
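
To show where the two paths can diverge, here is a minimal Java sketch (not taken from the Oozie code base) of the two authentication routes the Hadoop RPC client can take: a command-line process that owns a Kerberos login, versus a launcher/task container that can only rely on delegation tokens fetched at submission time. The principal and keytab path are placeholders; only the namenode address comes from the quoted stack trace.

{code:java}
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.security.TokenCache;
import org.apache.hadoop.security.Credentials;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosAuthPathsSketch {

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Without this, UGI stays in "simple" mode and neither TOKEN nor
        // KERBEROS is even attempted.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Path 1: command-line style (what makes plain distcp work).
        // The process logs in from a keytab (or already has a TGT from
        // kinit), so the RPC layer can use the KERBEROS SASL mechanism.
        // Principal and keytab path below are placeholders.
        UserGroupInformation.loginUserFromKeytab(
                "someuser@EXAMPLE.COM", "/etc/security/keytabs/someuser.keytab");
        FileSystem fs = FileSystem.get(
                new Path("hdfs://zdh143:9000/").toUri(), conf);
        System.out.println("/tmp exists: " + fs.exists(new Path("/tmp")));

        // Path 2: task-side style (what the Oozie launcher / YarnChild does).
        // A container has no TGT; it can only authenticate with delegation
        // tokens collected at submission time. If no token was obtained for
        // hdfs://zdh143:9000, the client has neither TOKEN nor KERBEROS and
        // fails exactly as in the quoted stack trace.
        Credentials creds = new Credentials();
        TokenCache.obtainTokensForNamenodes(
                creds, new Path[] { new Path("hdfs://zdh143:9000/") }, conf);
        System.out.println("tokens collected: " + creds.numberOfTokens());
    }
}
{code}

If I understand the mechanism correctly, mapreduce.job.hdfs-servers (prefixed with oozie.launcher. when it has to reach the launcher job) is the property that tells the submitter which additional namenodes to fetch tokens for; I have not verified whether that is the missing piece in our case.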

> When Kerberos is enabled, Oozie tasks throw “Client cannot authenticate 
> via:[TOKEN, KERBEROS]”
> ---------------------------------------------------------------------------------------------------
>
>                 Key: OOZIE-2871
>                 URL: https://issues.apache.org/jira/browse/OOZIE-2871
>             Project: Oozie
>          Issue Type: Bug
>          Components: security
>    Affects Versions: 4.2.0
>            Environment: Oozie version: 4.2.0
> Hadoop version: 2.7.2
> Both Oozie and Hadoop have Kerberos enabled.
>            Reporter: yangfang
>            Priority: Critical
>
> When both Oozie and Hadoop had Kerberos enabled, I submitted a MapReduce job to Oozie and got the error below:
> 2017-04-27 13:37:12,677 WARN MapReduceActionExecutor: 523 - SERVER[zdh143] USER[mr] GROUP[-] TOKEN[] APP[map-reduce-wf] JOB[0000008-170427133546167-oozie-mr-W] ACTION[0000008-170427133546167-oozie-mr-W@mr-node] Launcher exception: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "zdh142/10.43.183.142"; destination host is: "zdh143":9000; 
> java.io.IOException: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "zdh142/10.43.183.142"; destination host is: "zdh143":9000; 
>       at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:773)
>       at org.apache.hadoop.ipc.Client.call(Client.java:1479)
>       at org.apache.hadoop.ipc.Client.call(Client.java:1412)
>       at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
>       at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
>       at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
>       at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
>       at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>       at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
>       at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
>       at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
>       at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
>       at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>       at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1301)
>       at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1426)
>       at org.apache.hadoop.mapred.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:130)
>       at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:268)
>       at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)
>       at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1299)
>       at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:415)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
>       at org.apache.hadoop.mapreduce.Job.submit(Job.java:1296)
>       at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:575)
>       at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:570)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:415)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
>       at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:570)
>       at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:561)
>       at org.apache.oozie.action.hadoop.MapReduceMain.submitJob(MapReduceMain.java:102)
>       at org.apache.oozie.action.hadoop.MapReduceMain.run(MapReduceMain.java:64)
>       at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
>       at org.apache.oozie.action.hadoop.MapReduceMain.main(MapReduceMain.java:38)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:238)
>       at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>       at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
>       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
>       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:415)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
>       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> Caused by: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
>       at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:687)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:415)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
>       at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:650)
>       at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:737)
>       at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
>       at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
>       at org.apache.hadoop.ipc.Client.call(Client.java:1451)
>       ... 49 more
> Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
>       at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:172)
>       at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:396)
>       at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:560)
>       at org.apache.hadoop.ipc.Client$Connection.access$1900(Client.java:375)
>       at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:729)
>       at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:725)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:415)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
>       at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:724)
>       ... 52 more



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
