[ https://issues.apache.org/jira/browse/SPARK-31941?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17129074#comment-17129074 ]
Apache Spark commented on SPARK-31941:
--------------------------------------

User 'SaurabhChawla100' has created a pull request for this issue:
https://github.com/apache/spark/pull/28768

> Handling the exception in SparkUI for getSparkUser method
> ---------------------------------------------------------
>
>                 Key: SPARK-31941
>                 URL: https://issues.apache.org/jira/browse/SPARK-31941
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.1.0
>            Reporter: Saurabh Chawla
>            Priority: Minor
>
> After SPARK-31632, a SparkException is thrown from def applicationInfo():
> {code:java}
> def applicationInfo(): v1.ApplicationInfo = {
>   try {
>     // The ApplicationInfo may not be available when Spark is starting up.
>     store.view(classOf[ApplicationInfoWrapper]).max(1).iterator().next().info
>   } catch {
>     case _: NoSuchElementException =>
>       throw new SparkException("Failed to get the application information. " +
>         "If you are starting up Spark, please wait a while until it's ready.")
>   }
> }
> {code}
> However, the caller of this method, def getSparkUser in the Spark UI, does not handle SparkException in its catch block:
> {code:java}
> def getSparkUser: String = {
>   try {
>     Option(store.applicationInfo().attempts.head.sparkUser)
>       .orElse(store.environmentInfo().systemProperties.toMap.get("user.name"))
>       .getOrElse("<unknown>")
>   } catch {
>     case _: NoSuchElementException => "<unknown>"
>   }
> }
> {code}
> As a result, calling getSparkUser can make the application error out. So we should either throw
> {code:java}
> throw new NoSuchElementException("Failed to get the application information. " +
>   "If you are starting up Spark, please wait a while until it's ready.")
> {code}
> or else add a case to catch SparkException in getSparkUser:
> case _: SparkException => "<unknown>"
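
A minimal sketch of the second option (catching SparkException in getSparkUser) is shown below. This is only an illustration of the proposed change, not necessarily what the pull request above implements, and it assumes org.apache.spark.SparkException is in scope:

{code:java}
def getSparkUser: String = {
  try {
    // applicationInfo() may throw SparkException while Spark is starting up
    // (see SPARK-31632), so treat it the same way as NoSuchElementException.
    Option(store.applicationInfo().attempts.head.sparkUser)
      .orElse(store.environmentInfo().systemProperties.toMap.get("user.name"))
      .getOrElse("<unknown>")
  } catch {
    case _: NoSuchElementException => "<unknown>"
    case _: SparkException => "<unknown>"
  }
}
{code}

Falling back to "<unknown>" keeps the UI usable while the application information is not yet available, instead of surfacing the startup-time SparkException to the caller.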