[ https://issues.apache.org/jira/browse/SPARK-7110?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14548248#comment-14548248 ]

Thomas Graves commented on SPARK-7110:
--------------------------------------

Are you using Spark 1.1.0, as reported in the JIRA?

If so, then this is probably https://issues.apache.org/jira/browse/SPARK-3778, 
which was fixed in Spark 1.3. Can you try the newer version? Otherwise you could 
try patching 1.1.

It's calling into org.apache.spark.rdd.NewHadoopRDD.getPartitions, which ends 
up calling into org.apache.hadoop.fs.FileSystem.addDelegationTokens only if the 
tokens aren't already present. Since that is a NewHadoopRDD instance, the tokens 
should already have been populated at that point. That is why I'm thinking 
SPARK-3778 might be the issue.
  
Do you have a snippet of the code where you are creating the NewHadoopRDD? Are 
you using newAPIHadoopFile or newAPIHadoopRDD, for instance?
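
To illustrate the kind of snippet in question, here is a minimal sketch of that 
pattern (the paths, input/output formats, and the word-count step are 
hypothetical placeholders, not taken from the reporter's job):

{code:scala}
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._   // needed for PairRDDFunctions on Spark 1.1

object NewApiHadoopExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("NewApiHadoopExample"))

    // Reading through the new Hadoop API creates a NewHadoopRDD under the hood;
    // getPartitions on that RDD is where the delegation-token lookup described
    // above happens.
    val counts = sc
      .newAPIHadoopFile[LongWritable, Text, TextInputFormat]("hdfs:///path/to/input")
      .flatMap { case (_, line) => line.toString.split("\\s+") }
      .map(word => (word, 1L))
      .reduceByKey(_ + _)

    // Writing back via saveAsNewAPIHadoopFile, as in the original report.
    counts.saveAsNewAPIHadoopFile[TextOutputFormat[String, Long]]("hdfs:///path/to/output")

    sc.stop()
  }
}
{code}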

> when use saveAsNewAPIHadoopFile, sometimes it throws "Delegation Token can be 
> issued only with kerberos or web authentication"
> ------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-7110
>                 URL: https://issues.apache.org/jira/browse/SPARK-7110
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.1.0
>            Reporter: gu-chi
>            Assignee: Sean Owen
>
> Under yarn-client mode, this issue occurs randomly. The authentication method 
> is set to kerberos, and "saveAsNewAPIHadoopFile" in PairRDDFunctions is used to 
> save data to HDFS; then the exception comes as:
> org.apache.hadoop.ipc.RemoteException(java.io.IOException): Delegation Token 
> can be issued only with kerberos or web authentication


