[jira] [Commented] (SPARK-7110) when use saveAsNewAPIHadoopFile, sometimes it throws Delegation Token can be issued only with kerberos or web authentication

2015-05-18 Thread Thomas Graves (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14548248#comment-14548248
 ] 

Thomas Graves commented on SPARK-7110:
--

Are you using Spark 1.1.0, as reported in the JIRA?

If so, then this is probably 
https://issues.apache.org/jira/browse/SPARK-3778, which was fixed in Spark 1.3.  
Can you try the newer version?  Otherwise you could try patching 1.1.

It's calling into org.apache.spark.rdd.NewHadoopRDD.getPartitions, which ends 
up calling into org.apache.hadoop.fs.FileSystem.addDelegationTokens only if the 
tokens aren't already present.  Since that is a NewHadoopRDD instance it should 
have already populated them at that point.  That is why I'm thinking SPARK-3778 
might be the issue.

Do you have a snippet of code where you are creating the NewHadoopRDD?  Are 
you using newAPIHadoopFile or newAPIHadoopRDD, for instance?
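
For context, a minimal sketch of the kind of driver code in question; the paths, 
key/value types, and app name here are made-up placeholders, and {{newAPIHadoopFile}} 
is just one way to end up with a NewHadoopRDD:

{code:scala}
// Hypothetical repro-style snippet; paths and types are placeholders.
import org.apache.hadoop.io.{LongWritable, NullWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._   // pair-RDD implicits (needed on 1.1)

val sc = new SparkContext(new SparkConf().setAppName("spark-7110-repro"))

// newAPIHadoopFile constructs a NewHadoopRDD under the covers.
val input = sc.newAPIHadoopFile[LongWritable, Text, TextInputFormat]("hdfs:///tmp/in")

// saveAsNewAPIHadoopFile is where the reported exception surfaces.
input.map { case (_, v) => (NullWritable.get(), v) }
  .saveAsNewAPIHadoopFile[TextOutputFormat[NullWritable, Text]]("hdfs:///tmp/out")
{code}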

 when use saveAsNewAPIHadoopFile, sometimes it throws Delegation Token can be 
 issued only with kerberos or web authentication
 --

 Key: SPARK-7110
 URL: https://issues.apache.org/jira/browse/SPARK-7110
 Project: Spark
  Issue Type: Bug
  Components: YARN
Affects Versions: 1.1.0
Reporter: gu-chi
Assignee: Sean Owen

 Under yarn-client mode, this issue occurs randomly. The authentication method is 
 set to kerberos, and saveAsNewAPIHadoopFile in PairRDDFunctions is used to save 
 data to HDFS; then the exception comes as:
 org.apache.hadoop.ipc.RemoteException(java.io.IOException): Delegation Token 
 can be issued only with kerberos or web authentication






[jira] [Commented] (SPARK-7110) when use saveAsNewAPIHadoopFile, sometimes it throws Delegation Token can be issued only with kerberos or web authentication

2015-05-18 Thread gu-chi (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14549707#comment-14549707
 ] 

gu-chi commented on SPARK-7110:
---

From the description and the fix for SPARK-3778, I think it should work; I will 
give it a try.




[jira] [Commented] (SPARK-7110) when use saveAsNewAPIHadoopFile, sometimes it throws Delegation Token can be issued only with kerberos or web authentication

2015-05-17 Thread gu-chi (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14547098#comment-14547098
 ] 

gu-chi commented on SPARK-7110:
---

Sorry, I was busy these days.
Actually, I tried to reproduce this issue for a long time, but no attempt 
succeeded. Fortunately, a colleague of mine had an environment that can reproduce 
it, but that is private code; all I know is that it uses saveAsNewAPIHadoopFile 
directly.
Here is the full stack:
2015-04-09 16:42:17,908 [sparkDriver-akka.actor.default-dispatcher-16] WARN  org.apache.spark.scheduler.DAGScheduler - Creating new stage failed due to exception - job: 6
org.apache.hadoop.ipc.RemoteException(java.io.IOException): Delegation Token can be issued only with kerberos or web authentication
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getDelegationToken(FSNamesystem.java:6362)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getDelegationToken(NameNodeRpcServer.java:478)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getDelegationToken(ClientNamenodeProtocolServerSideTranslatorPB.java:912)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1612)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)

    at org.apache.hadoop.ipc.Client.call(Client.java:1410)
    at org.apache.hadoop.ipc.Client.call(Client.java:1363)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy51.getDelegationToken(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getDelegationToken(ClientNamenodeProtocolTranslatorPB.java:862)
    at sun.reflect.GeneratedMethodAccessor73.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy52.getDelegationToken(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getDelegationToken(DFSClient.java:948)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getDelegationToken(DistributedFileSystem.java:1377)
    at org.apache.hadoop.fs.FileSystem.collectDelegationTokens(FileSystem.java:527)
    at org.apache.hadoop.fs.FileSystem.addDelegationTokens(FileSystem.java:505)
    at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:121)
    at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:100)
    at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodes(TokenCache.java:80)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:242)
    at org.apache.hadoop.mapreduce.lib.input.CombineFileInputFormat.getSplits(CombineFileInputFormat.java:217)
    at com.huawei.dpa.calculate.basecomp.api.input.KeyNullInputFormat.getSplits(KeyNullInputFormat.java:45)
    at org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:98)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:203)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:203)
    at org.apache.spark.rdd.FilteredRDD.getPartitions(FilteredRDD.scala:29)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:203)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:203)
    at org.apache.spark.rdd.FlatMappedRDD.getPartitions(FlatMappedRDD.scala:30)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:203)
    at scala.Option.getOrElse(Option.scala:120)
    at 

[jira] [Commented] (SPARK-7110) when use saveAsNewAPIHadoopFile, sometimes it throws Delegation Token can be issued only with kerberos or web authentication

2015-05-11 Thread Apache Spark (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14537839#comment-14537839
 ] 

Apache Spark commented on SPARK-7110:
-

User 'srowen' has created a pull request for this issue:
https://github.com/apache/spark/pull/6052




[jira] [Commented] (SPARK-7110) when use saveAsNewAPIHadoopFile, sometimes it throws Delegation Token can be issued only with kerberos or web authentication

2015-05-11 Thread Thomas Graves (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14537964#comment-14537964
 ] 

Thomas Graves commented on SPARK-7110:
--

[~gu-chi]  Any chance you can provide the information I asked about?




[jira] [Commented] (SPARK-7110) when use saveAsNewAPIHadoopFile, sometimes it throws Delegation Token can be issued only with kerberos or web authentication

2015-05-08 Thread Thomas Graves (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14534494#comment-14534494
 ] 

Thomas Graves commented on SPARK-7110:
--

[~gu-chi]  Is there some of the stack trace missing from the description?  If 
so, could you attach the rest of it?

Could you also provide the context in which NewHadoopRDD.getPartitions 
is called? Are you calling it directly, or is it being called from another Spark 
routine? (If so, which interface?)






[jira] [Commented] (SPARK-7110) when use saveAsNewAPIHadoopFile, sometimes it throws Delegation Token can be issued only with kerberos or web authentication

2015-05-07 Thread gu-chi (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14533755#comment-14533755
 ] 

gu-chi commented on SPARK-7110:
---

I analyzed the exception stack trace again and found the exception was thrown 
at NewHadoopRDD.getPartitions, when invoking inputFormat.getSplits, so I added a 
modification there similar to SPARK-1203. After running for a long time, it has 
never occurred again.
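
For illustration only, a rough sketch of the kind of change being described; this 
is not the actual SPARK-1203 or SPARK-7110 patch, and the helper name and parameters 
are placeholders for whatever is in scope at that point in getPartitions:

{code:scala}
// Illustrative sketch, not the committed fix: before asking the InputFormat for
// splits, copy the driver's existing delegation tokens into the job context so
// that FileInputFormat.listStatus does not try to request a fresh token.
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.mapreduce.{InputFormat, InputSplit, Job}
import org.apache.hadoop.security.UserGroupInformation

def getSplitsWithDriverTokens(conf: Configuration,
                              inputFormat: InputFormat[_, _]): java.util.List[InputSplit] = {
  val jobContext = Job.getInstance(conf)
  // Re-attach whatever tokens the current user (the driver) already holds.
  jobContext.getCredentials.addAll(UserGroupInformation.getCurrentUser.getCredentials)
  inputFormat.getSplits(jobContext)
}
{code}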




[jira] [Commented] (SPARK-7110) when use saveAsNewAPIHadoopFile, sometimes it throws Delegation Token can be issued only with kerberos or web authentication

2015-04-24 Thread Sean Owen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14510900#comment-14510900
 ] 

Sean Owen commented on SPARK-7110:
--

I think you may need to apply the same fix to {{saveAsNewAPIHadoopDataset}}? Open 
a PR and have Thomas Graves look at it.




[jira] [Commented] (SPARK-7110) when use saveAsNewAPIHadoopFile, sometimes it throws Delegation Token can be issued only with kerberos or web authentication

2015-04-24 Thread Thomas Graves (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14511007#comment-14511007
 ] 

Thomas Graves commented on SPARK-7110:
--

So with the new APIs, the call to {{val job = new NewAPIHadoopJob(hadoopConf)}} 
automatically adds credentials for you. At least normally it does.  What version 
of Hadoop are you using?

So this sometimes works and sometimes doesn't?  Is it similar to what is described 
in SPARK-1203?  I think all I did to reproduce that was to run a bunch of stuff in 
spark-shell, then wait a while before doing the saveAs.  Waiting basically allows 
the Hadoop FileSystems to close, and then when you go to reopen them you don't have 
the necessary credentials.  I think it was only a few minutes. 
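
Roughly, the spark-shell reproduction described above looks like the following; the 
paths and the wait time are placeholders, not the exact steps used back then:

{code:scala}
// Run in spark-shell on a kerberized cluster in yarn-client mode.
import org.apache.hadoop.io.{NullWritable, Text}
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat

val data = sc.textFile("hdfs:///tmp/some/input").cache()
data.count()                     // touch HDFS so a FileSystem instance gets opened

Thread.sleep(10 * 60 * 1000L)    // wait a while before saving, as described above

data.map(line => (NullWritable.get(), new Text(line)))
  .saveAsNewAPIHadoopFile[TextOutputFormat[NullWritable, Text]]("hdfs:///tmp/some/output")
{code}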




[jira] [Commented] (SPARK-7110) when use saveAsNewAPIHadoopFile, sometimes it throws Delegation Token can be issued only with kerberos or web authentication

2015-04-23 Thread gu-chi (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14510420#comment-14510420
 ] 

gu-chi commented on SPARK-7110:
---

The exception stack trace is as below:
org.apache.hadoop.ipc.RemoteException(java.io.IOException): Delegation Token can be issued only with kerberos or web authentication
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getDelegationToken(FSNamesystem.java:6362)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getDelegationToken(NameNodeRpcServer.java:478)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getDelegationToken(ClientNamenodeProtocolServerSideTranslatorPB.java:912)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1612)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)

    at org.apache.hadoop.ipc.Client.call(Client.java:1410)
    at org.apache.hadoop.ipc.Client.call(Client.java:1363)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy51.getDelegationToken(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getDelegationToken(ClientNamenodeProtocolTranslatorPB.java:862)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy52.getDelegationToken(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getDelegationToken(DFSClient.java:948)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getDelegationToken(DistributedFileSystem.java:1377)
    at org.apache.hadoop.fs.FileSystem.collectDelegationTokens(FileSystem.java:527)
    at org.apache.hadoop.fs.FileSystem.addDelegationTokens(FileSystem.java:505)
    at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:121)
    at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:100)
    at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodes(TokenCache.java:80)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:242)
    at org.apache.hadoop.mapreduce.lib.input.CombineFileInputFormat.getSplits(CombineFileInputFormat.java:217)




[jira] [Commented] (SPARK-7110) when use saveAsNewAPIHadoopFile, sometimes it throws Delegation Token can be issued only with kerberos or web authentication

2015-04-23 Thread gu-chi (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14510422#comment-14510422
 ] 

gu-chi commented on SPARK-7110:
---

Searching the patch history, I found that SPARK-1203 shows the same stack trace, 
but it was using the saveAsTextFile method.
