[ https://issues.apache.org/jira/browse/SPARK-29043?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16927402#comment-16927402 ]

feiwang edited comment on SPARK-29043 at 9/11/19 8:41 AM:
----------------------------------------------------------

[~kabhwan]
* What is "spark.history.fs.update.interval" set to?    20s (see the config sketch after this list)
* How many applications are reloaded per each call of checkForLogs?   50000+
* How big is the event log for each application?    There may be many large logs.
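
A small Scala sketch of the settings involved, assuming they are expressed through SparkConf (in a real deployment they would normally live in the history server's spark-defaults.conf); the values are the ones reported in this thread and in the issue description below:

import org.apache.spark.SparkConf

object HistoryServerConfSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      // How often the FS history provider scans the event log directory.
      .set("spark.history.fs.update.interval", "20s")
      // Size of the pool that replays event logs (see the issue description).
      .set("spark.history.fs.numReplayThreads", "30")
    println(conf.toDebugString)
  }
}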

I think SPARK-28594 is more helpful for our case.


> [History Server] Only one replay thread of FsHistoryProvider works because of 
> a straggler
> -------------------------------------------------------------------------------------
>
>                 Key: SPARK-29043
>                 URL: https://issues.apache.org/jira/browse/SPARK-29043
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.4.4
>            Reporter: feiwang
>            Priority: Major
>         Attachments: image-2019-09-11-15-09-22-912.png, 
> image-2019-09-11-15-10-25-326.png, screenshot-1.png
>
>
> As shown in the attachment, we set spark.history.fs.numReplayThreads=30 for 
> the Spark history server.
> However, only one replay thread is doing work because of a straggler.
> Let's check the code:
> https://github.com/apache/spark/blob/7f36cd2aa5e066a807d498b8c51645b136f08a75/core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala#L509-L547
> There is a synchronous wait on all replay tasks, so a single slow replay blocks the whole scan.
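
To make the pattern above concrete, here is a minimal, self-contained Scala sketch of submit-then-wait on a fixed replay pool (names, sizes, and timings are illustrative, not the actual FsHistoryProvider code): once the fast logs finish, only the straggler's thread is busy, and nothing else can start until the wait returns.

import java.util.concurrent.{Executors, TimeUnit}

object ReplayWaitSketch {
  def main(args: Array[String]): Unit = {
    // Stand-in for spark.history.fs.numReplayThreads=30.
    val replayExecutor = Executors.newFixedThreadPool(30)

    // One task per event log: 99 logs replay quickly, one is a straggler.
    val replayDurationsMs: Seq[Long] = Seq.fill(99)(50L) :+ 5000L

    val tasks = replayDurationsMs.zipWithIndex.map { case (durationMs, i) =>
      replayExecutor.submit(new Runnable {
        override def run(): Unit = {
          Thread.sleep(durationMs) // stands in for replaying one event log
          println(s"replayed log $i")
        }
      })
    }

    // Like the scan described above, block until every replay task has
    // finished before the next scan can be scheduled; a single slow replay
    // delays everything.
    tasks.foreach(_.get())
    println("scan finished; the next scan may now be scheduled")

    replayExecutor.shutdown()
    replayExecutor.awaitTermination(1, TimeUnit.MINUTES)
  }
}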


