[jira] [Assigned] (SPARK-18840) HDFSCredentialProvider throws exception in non-HDFS security environment
[ https://issues.apache.org/jira/browse/SPARK-18840?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-18840:
------------------------------------

    Assignee:     (was: Apache Spark)

> HDFSCredentialProvider throws exception in non-HDFS security environment
> -------------------------------------------------------------------------
>
>                 Key: SPARK-18840
>                 URL: https://issues.apache.org/jira/browse/SPARK-18840
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.6.3, 2.1.0
>            Reporter: Saisai Shao
>            Priority: Minor
>
> Currently in {{HDFSCredentialProvider}}, the code assumes that an HDFS
> delegation token exists. That is fine in an HDFS environment, but in some
> cloud environments, such as Azure, HDFS is not required, so the code throws
> an exception:
> {code}
> java.util.NoSuchElementException: head of empty list
> 	at scala.collection.immutable.Nil$.head(List.scala:337)
> 	at scala.collection.immutable.Nil$.head(List.scala:334)
> 	at org.apache.spark.deploy.yarn.Client.getTokenRenewalInterval(Client.scala:627)
> {code}
> We should also handle this situation.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-18840) HDFSCredentialProvider throws exception in non-HDFS security environment
[ https://issues.apache.org/jira/browse/SPARK-18840?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-18840:
------------------------------------

    Assignee: Apache Spark
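The stack trace above comes from calling {{.head}} on an empty list of delegation tokens in {{Client.getTokenRenewalInterval}}. A minimal Scala sketch of the defensive pattern, assuming a hypothetical {{renewalIntervals}} helper (not Spark's actual API): using {{headOption}} instead of {{head}} turns "no HDFS token present" into {{None}} rather than a {{NoSuchElementException}}.

```scala
// Sketch of the failure mode and the guard, under the assumption that the
// token renewal intervals are collected into a plain Seq[Long]. In a
// non-HDFS environment (e.g. Azure) that Seq is empty, and .head crashes.
object TokenRenewalSketch {
  // Hypothetical stand-in for the interval list derived from HDFS
  // delegation tokens; empty when the cluster has no HDFS.
  def renewalIntervals(tokens: Seq[Long]): Option[Long] =
    // headOption returns None on an empty list instead of throwing,
    // letting callers treat a missing HDFS token as a valid state.
    tokens.headOption

  def main(args: Array[String]): Unit = {
    println(renewalIntervals(Seq.empty))      // None, no exception
    println(renewalIntervals(Seq(86400000L))) // Some(86400000)
  }
}
```

Callers can then skip scheduling token renewal when the result is {{None}}, instead of assuming an interval is always available.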