[ https://issues.apache.org/jira/browse/SPARK-31941?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Kousuke Saruta updated SPARK-31941:
-----------------------------------
    Fix Version/s: 2.4.7
                   3.1.0
                   3.0.1

> Handling the exception in SparkUI for getSparkUser method
> ---------------------------------------------------------
>
>                 Key: SPARK-31941
>                 URL: https://issues.apache.org/jira/browse/SPARK-31941
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.4.6, 3.0.0, 3.1.0
>            Reporter: Saurabh Chawla
>            Assignee: Saurabh Chawla
>            Priority: Minor
>             Fix For: 3.0.1, 3.1.0, 2.4.7
>
>
> After SPARK-31632, a SparkException is thrown from applicationInfo:
> {code:java}
> def applicationInfo(): v1.ApplicationInfo = {
>   try {
>     // The ApplicationInfo may not be available when Spark is starting up.
>     store.view(classOf[ApplicationInfoWrapper]).max(1).iterator().next().info
>   } catch {
>     case _: NoSuchElementException =>
>       throw new SparkException("Failed to get the application information. " +
>         "If you are starting up Spark, please wait a while until it's ready.")
>   }
> }
> {code}
> However, its caller getSparkUser in the Spark UI does not handle SparkException in its catch block:
> {code:java}
> def getSparkUser: String = {
>   try {
>     Option(store.applicationInfo().attempts.head.sparkUser)
>       .orElse(store.environmentInfo().systemProperties.toMap.get("user.name"))
>       .getOrElse("<unknown>")
>   } catch {
>     case _: NoSuchElementException => "<unknown>"
>   }
> }
> {code}
> As a result, a call to getSparkUser can make the application error out. We should either throw
> {code:java}
> throw new NoSuchElementException("Failed to get the application information. " +
>   "If you are starting up Spark, please wait a while until it's ready.")
> {code}
> from applicationInfo, or else add a case to getSparkUser to catch the SparkException:
> case _: SparkException => "<unknown>"

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
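> The second option would amount to a change like the following sketch of getSparkUser (assuming SparkException is imported from org.apache.spark; this is the proposed shape of the fix, not the merged patch):
> {code:java}
> def getSparkUser: String = {
>   try {
>     Option(store.applicationInfo().attempts.head.sparkUser)
>       .orElse(store.environmentInfo().systemProperties.toMap.get("user.name"))
>       .getOrElse("<unknown>")
>   } catch {
>     // applicationInfo now throws SparkException during startup (SPARK-31632),
>     // so fall back to "<unknown>" here as well instead of failing the UI.
>     case _: NoSuchElementException => "<unknown>"
>     case _: SparkException => "<unknown>"
>   }
> }
> {code}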