[ 
https://issues.apache.org/jira/browse/SPARK-31380?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Srinivas Rishindra Pothireddi updated SPARK-31380:
--------------------------------------------------
    Description: 
Peak Execution Memory Quantile is displayed correctly in the regular Spark UI. If the same 
application is viewed in the Spark History Server UI, Peak Execution Memory is always 
displayed as zero.

The Spark event log for the application appears to contain Peak Execution Memory correctly 
(under the tag "internal.metrics.peakExecutionMemory"). However, this is not reflected in 
the History Server UI.
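
For reference, the presence of the metric in the event log can be checked directly from the 
log file. This is only an illustrative sketch: the event log directory and application id 
below are placeholders, and it assumes an uncompressed JSON event log.

import scala.io.Source

// Placeholder path: substitute the configured spark.eventLog.dir and the real application id.
val eventLog = "/tmp/spark-events/app-20200407120000-0000"

// Each line of the event log is one JSON event; task-end events carry the accumulator
// named "internal.metrics.peakExecutionMemory" when a value was recorded.
val peakEvents = Source.fromFile(eventLog).getLines()
  .filter(_.contains("internal.metrics.peakExecutionMemory"))
  .toList

println(s"events mentioning peakExecutionMemory: ${peakEvents.size}")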

*Steps to produce non-zero peakExecutionMemory*

spark.range(0, 200000).map { x => (x, x % 20) }.toDF("a", "b").createOrReplaceTempView("fred")

spark.range(0, 200000).map { x => (x, x + 1) }.toDF("a", "b").createOrReplaceTempView("phil")

sql("select p.*, f.* from phil p join fred f on f.b = p.b").count

 

> Peak Execution Memory Quantile is not displayed in Spark History Server UI
> --------------------------------------------------------------------------
>
>                 Key: SPARK-31380
>                 URL: https://issues.apache.org/jira/browse/SPARK-31380
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, Web UI
>    Affects Versions: 3.0.0
>            Reporter: Srinivas Rishindra Pothireddi
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
