[ https://issues.apache.org/jira/browse/SPARK-21349?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16079751#comment-16079751 ]

Dongjoon Hyun commented on SPARK-21349:
---------------------------------------

This issue is not about blindly raising the 100 KB threshold. The default value 
will stay the same for all users. What I mean is that tasks have become larger 
than they were three years ago, so the warning now fires more frequently and 
more misleadingly than it used to.

Was there any reason or criterion for choosing 100 KB as the threshold three 
years ago? If so, can we at least re-evaluate that threshold?
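
For context, the check in question lives in the task scheduling path. The sketch 
below is paraphrased from memory, not a verbatim excerpt of the Spark source, 
but it shows the shape of the current hard-coded behavior:

    import java.nio.ByteBuffer

    // Rough sketch of the existing hard-coded check (paraphrased; the real
    // logic lives in TaskSetManager and logs through the Logging trait).
    object TaskSizeWarning {
      // Warn when a stage contains a task whose serialized size exceeds this.
      val TASK_SIZE_TO_WARN_KB = 100

      def maybeWarn(stageId: Int, serializedTask: ByteBuffer): Unit = {
        if (serializedTask.limit() > TASK_SIZE_TO_WARN_KB * 1024) {
          Console.err.println(s"Stage $stageId contains a task of very large " +
            s"size (${serializedTask.limit() / 1024} KB). The maximum " +
            s"recommended task size is $TASK_SIZE_TO_WARN_KB KB.")
        }
      }
    }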

> Make TASK_SIZE_TO_WARN_KB configurable
> --------------------------------------
>
>                 Key: SPARK-21349
>                 URL: https://issues.apache.org/jira/browse/SPARK-21349
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.6.3, 2.2.0
>            Reporter: Dongjoon Hyun
>            Priority: Minor
>
> Since Spark 1.1.0, Spark has emitted a warning when a task's size exceeds a 
> threshold (SPARK-2185). Although this is only a warning message, this issue 
> proposes making `TASK_SIZE_TO_WARN_KB` a normal Spark configuration for 
> advanced users. According to the Jenkins log, we hit this warning 123 times 
> even in our own unit tests.
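
Regarding the configuration proposal quoted above, one possible shape is 
sketched below. This is purely hypothetical: the key name 
"spark.scheduler.taskSizeWarningKb" and the default of 100 are assumptions for 
illustration, not an existing or agreed-upon Spark configuration.

    import org.apache.spark.SparkConf

    // Hypothetical sketch: read the warning threshold from SparkConf instead
    // of using a hard-coded constant. The key name below is an assumption.
    object ConfigurableTaskSizeWarn {
      def taskSizeToWarnKb(conf: SparkConf): Int =
        conf.getInt("spark.scheduler.taskSizeWarningKb", 100)

      def main(args: Array[String]): Unit = {
        // An advanced user could then raise the threshold per application:
        val conf = new SparkConf().set("spark.scheduler.taskSizeWarningKb", "512")
        println(taskSizeToWarnKb(conf))  // prints 512
      }
    }

The scheduler-side check would then compare the serialized task size against 
taskSizeToWarnKb(conf) * 1024 instead of the constant, as in the earlier sketch.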


