[ https://issues.apache.org/jira/browse/SPARK-23385?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-23385.
-------------------------------
    Resolution: Won't Fix

> Allow customized SparkUITab to be added via SparkConf and loaded when 
> creating SparkUI
> --------------------------------------------------------------------------------------
>
>                 Key: SPARK-23385
>                 URL: https://issues.apache.org/jira/browse/SPARK-23385
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>    Affects Versions: 2.2.1
>            Reporter: Lantao Jin
>            Priority: Major
>
> It would be nice if there were a mechanism for registering customized 
> SparkUITab implementations (alongside the built-in Jobs, Stages, Storage, 
> Environment, and Executors tabs) through SparkConf settings. This would be 
> more flexible when we need to display special information in the UI than 
> adding built-in tabs one at a time and waiting for the community to merge 
> them.
> I propose to introduce a new configuration option, spark.extraUITabs, that 
> allows customized WebUITab classes to be specified in SparkConf and 
> registered when the SparkUI is created. Here is the proposed documentation 
> for the new option:
> {quote}
> A comma-separated list of classes that implement SparkUITab; when 
> initializing SparkUI, instances of these classes will be created and 
> registered in the tabs array of SparkUI. If a class has a two-argument 
> constructor that accepts a SparkUI and an AppStatusStore, that constructor 
> will be called; otherwise, if the class has a single-argument constructor 
> that accepts a SparkUI, that one will be called; otherwise, a zero-argument 
> constructor will be called. If no valid constructor can be found, SparkUI 
> creation will fail with an exception.
> {quote}
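> For illustration, here is a minimal sketch of the intent. QueueStatusTab is 
> a hypothetical class name, and SparkUITab, SparkUI, and AppStatusStore are 
> currently private[spark], so this assumes the proposal would expose them for 
> extension:
> {code:scala}
> import org.apache.spark.status.AppStatusStore
> import org.apache.spark.ui.{SparkUI, SparkUITab}
>
> // A hypothetical custom tab using the preferred two-argument constructor.
> class QueueStatusTab(parent: SparkUI, store: AppStatusStore)
>   extends SparkUITab(parent, "queue") {
>   // Pages would be attached here, e.g.
>   // attachPage(new QueueStatusPage(this, store))
> }
> {code}
> The tab would then be enabled with, e.g.:
> {code}
> spark.extraUITabs=com.example.ui.QueueStatusTab
> {code}
> The constructor resolution described in the quote could be implemented in 
> SparkUI roughly as follows (illustrative, not existing Spark code):
> {code:scala}
> def loadExtraTab(className: String, ui: SparkUI, store: AppStatusStore): SparkUITab = {
>   val clazz = Class.forName(className)
>   val instance =
>     try {
>       // Preferred: the two-argument (SparkUI, AppStatusStore) constructor.
>       clazz.getConstructor(classOf[SparkUI], classOf[AppStatusStore])
>         .newInstance(ui, store)
>     } catch {
>       case _: NoSuchMethodException =>
>         try {
>           // Next: the single-argument (SparkUI) constructor.
>           clazz.getConstructor(classOf[SparkUI]).newInstance(ui)
>         } catch {
>           case _: NoSuchMethodException =>
>             // Last resort: a zero-argument constructor. If this also fails,
>             // the exception propagates and SparkUI creation fails, as
>             // specified in the proposed documentation.
>             clazz.getConstructor().newInstance()
>         }
>     }
>   instance.asInstanceOf[SparkUITab]
> }
> {code}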


