[ https://issues.apache.org/jira/browse/SPARK-31380?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17085138#comment-17085138 ]
Srinivas Rishindra Pothireddi edited comment on SPARK-31380 at 4/16/20, 6:05 PM:
---------------------------------------------------------------------------------

I tested this against Spark master at the time of creating this ticket. I am no longer seeing the issue now, either on master or on spark-3.0.0-preview2, so it may have been fixed recently. The issue definitely exists in spark-2.4.5. The history server might also be flaky, which could explain why we see this issue intermittently; for example, when I ran my application to test it, I could see the metrics in Safari but not in Chrome.

!image-2020-04-16-11-04-59-036.png!

was (Author: sririshindra):
I tested this against Spark master at the time of creating this ticket. I am no longer seeing the issue now, either on master or on spark-3.0.0-preview2, so it may have been fixed recently. The issue definitely exists in spark-2.4.5. The history server might also be flaky, which could explain why we see this issue intermittently; for example, when I ran my application to test it, I could see the metrics in Safari but not in Chrome.

!Screen Shot 2020-04-16 at 10.55.17 AM.png!
!Screen Shot 2020-04-16 at 10.57.14 AM.png!

> Peak Execution Memory Quantile is not displayed in Spark History Server UI
> --------------------------------------------------------------------------
>
> Key: SPARK-31380
> URL: https://issues.apache.org/jira/browse/SPARK-31380
> Project: Spark
> Issue Type: Bug
> Components: Spark Core, Web UI
> Affects Versions: 3.0.0
> Reporter: Srinivas Rishindra Pothireddi
> Priority: Major
> Attachments: image-2020-04-15-18-16-18-254.png, image-2020-04-16-11-04-58-953.png, image-2020-04-16-11-04-59-036.png
>
> Peak Execution Memory Quantile is displayed correctly in the regular Spark UI. If the same application is viewed in the Spark History Server UI, Peak Execution Memory is always displayed as zero.
> The Spark event log for the application seems to contain Peak Execution Memory (under the tag "internal.metrics.peakExecutionMemory") correctly. However, this is not reflected in the History Server UI.
>
> *Steps to produce non-zero Peak Execution Memory*
>
> spark.range(0, 200000).map { x => (x, x % 20) }.toDF("a", "b").createOrReplaceTempView("fred")
> spark.range(0, 200000).map { x => (x, x + 1) }.toDF("a", "b").createOrReplaceTempView("phil")
> sql("select p.*, f.* from phil p join fred f on f.b = p.b").count
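The comment above claims the metric is present in the event log under "internal.metrics.peakExecutionMemory" even though the History Server shows zero. One way to verify that claim independently of the UI is to read the accumulator out of the JSON event log directly. A minimal sketch (in Python, since it needs no Spark installation): the sample SparkListenerTaskEnd event below is abbreviated and hypothetical, including its memory value; real event-log lines carry many more fields, but the "Task Info" / "Accumulables" layout is the same.

```python
import json

# Abbreviated, hypothetical task-end event; the 16842752 value is made up
# purely for illustration.
sample_event = json.dumps({
    "Event": "SparkListenerTaskEnd",
    "Stage ID": 1,
    "Task Info": {
        "Accumulables": [
            {"Name": "internal.metrics.peakExecutionMemory",
             "Update": 16842752, "Value": 16842752},
            {"Name": "internal.metrics.executorRunTime",
             "Update": 312, "Value": 312},
        ]
    },
})

def peak_execution_memory(event_line: str):
    """Return the peakExecutionMemory update for a task-end event, or None."""
    event = json.loads(event_line)
    if event.get("Event") != "SparkListenerTaskEnd":
        return None
    for acc in event.get("Task Info", {}).get("Accumulables", []):
        if acc.get("Name") == "internal.metrics.peakExecutionMemory":
            return acc.get("Update")
    return None

print(peak_execution_memory(sample_event))
```

Running this over every line of an application's event log and checking for non-zero values would confirm whether the gap is in the log itself or in how the History Server renders it.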