[ https://issues.apache.org/jira/browse/SPARK-21349?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16080875#comment-16080875 ]

Shivaram Venkataraman commented on SPARK-21349:
-----------------------------------------------

Well, 100 KB is already too large IMHO, and I'm not sure adding another config 
property really helps things just to silence some log messages. Looking at the 
code, it seems that the larger task sizes mostly stem from the TaskMetrics 
objects getting bigger -- especially with a number of new SQL metrics being 
added. I think the right fix here is to improve the serialization of 
TaskMetrics (in particular, if the structure is empty, why bother sending 
anything at all to the worker?).
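
To make the suggestion concrete, here is a minimal sketch of skipping empty 
structures during serialization. The TaskMetrics fields and the 
writeTaskMetrics/readTaskMetrics helpers below are illustrative stand-ins, 
not Spark's actual internals:

```scala
import java.io.{DataInputStream, DataOutputStream}

object TaskMetricsSerDe {
  // Illustrative stand-in for Spark's TaskMetrics; not the real class.
  case class TaskMetrics(
      executorRunTime: Long = 0L,
      resultSize: Long = 0L,
      sqlMetrics: Map[String, Long] = Map.empty) {
    // "Empty" means nothing was recorded, so there is nothing worth shipping.
    def isEmpty: Boolean =
      executorRunTime == 0L && resultSize == 0L && sqlMetrics.isEmpty
  }

  // Write a single presence byte for empty metrics instead of the full payload.
  def writeTaskMetrics(out: DataOutputStream, m: TaskMetrics): Unit = {
    if (m.isEmpty) {
      out.writeBoolean(false)
    } else {
      out.writeBoolean(true)
      out.writeLong(m.executorRunTime)
      out.writeLong(m.resultSize)
      out.writeInt(m.sqlMetrics.size)
      m.sqlMetrics.foreach { case (name, value) =>
        out.writeUTF(name)
        out.writeLong(value)
      }
    }
  }

  def readTaskMetrics(in: DataInputStream): TaskMetrics = {
    if (!in.readBoolean()) {
      TaskMetrics() // nothing was sent; reconstruct an empty instance
    } else {
      val runTime = in.readLong()
      val size = in.readLong()
      val n = in.readInt()
      val sql = (0 until n).map(_ => in.readUTF() -> in.readLong()).toMap
      TaskMetrics(runTime, size, sql)
    }
  }
}
```

In this sketch the per-task cost of an empty TaskMetrics drops to a single 
byte instead of a serialized object graph.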

> Make TASK_SIZE_TO_WARN_KB configurable
> --------------------------------------
>
>                 Key: SPARK-21349
>                 URL: https://issues.apache.org/jira/browse/SPARK-21349
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.6.3, 2.2.0
>            Reporter: Dongjoon Hyun
>            Priority: Minor
>
> Since Spark 1.1.0, Spark has emitted a warning when the task size exceeds a 
> threshold (SPARK-2185). Although this is just a warning message, this issue 
> proposes making `TASK_SIZE_TO_WARN_KB` a normal Spark configuration so that 
> advanced users can tune it. According to the Jenkins log, we see 123 such 
> warnings even in our unit tests.
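
For context, a sketch of what the proposed change might look like. The config 
key `spark.task.sizeWarningThresholdKb` is hypothetical; today the 100 KB 
constant is hard-coded as `TASK_SIZE_TO_WARN_KB` in TaskSetManager:

```scala
import org.apache.spark.SparkConf

object TaskSizeWarning {
  // Fall back to 100 KB, matching the current hard-coded constant.
  def thresholdKb(conf: SparkConf): Int =
    conf.getInt("spark.task.sizeWarningThresholdKb", 100)

  // Mirror the existing warning, but against a user-tunable threshold.
  def maybeWarn(conf: SparkConf, stageId: Int, taskBytes: Long)
               (warn: String => Unit): Unit = {
    val limitKb = thresholdKb(conf)
    if (taskBytes > limitKb * 1024L) {
      warn(s"Stage $stageId contains a task of very large size " +
        s"(${taskBytes / 1024} KB). The maximum recommended task size is " +
        s"$limitKb KB.")
    }
  }
}
```

An advanced user could then quiet the noise in tests with, e.g., 
`--conf spark.task.sizeWarningThresholdKb=1024`.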


