[ https://issues.apache.org/jira/browse/HIVE-9251?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xuefu Zhang updated HIVE-9251:
------------------------------
       Resolution: Fixed
    Fix Version/s: spark-branch
           Status: Resolved  (was: Patch Available)

Committed to spark branch. Thanks, Rui.

> SetSparkReducerParallelism is likely to set too small number of reducers 
> [Spark Branch]
> ---------------------------------------------------------------------------------------
>
>                 Key: HIVE-9251
>                 URL: https://issues.apache.org/jira/browse/HIVE-9251
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Rui Li
>            Assignee: Rui Li
>             Fix For: spark-branch
>
>         Attachments: HIVE-9251.1-spark.patch, HIVE-9251.2-spark.patch, 
> HIVE-9251.3-spark.patch, HIVE-9251.4-spark.patch, HIVE-9251.5-spark.patch, 
> HIVE-9251.6-spark.patch
>
>
> Setting too few reducers makes each reducer's shuffle partition larger, which
> may hurt performance or even lead to task failures. For example, Spark's
> netty-based shuffle limits the maximum frame size to 2 GB.
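
For context, here is a minimal sketch (not the actual HIVE-9251 patch; the
constants are hypothetical stand-ins loosely modeled on
hive.exec.reducers.bytes.per.reducer and hive.exec.reducers.max) of how reducer
parallelism might be derived from the estimated shuffle input size, so that no
single reducer's partition outgrows limits like the 2 GB netty frame size:

// Minimal sketch, not Hive's actual SetSparkReducerParallelism logic.
public class ReducerParallelismSketch {
    // Hypothetical defaults; in Hive these would come from configuration.
    static final long BYTES_PER_REDUCER = 256L * 1024 * 1024; // 256 MB per reducer
    static final int MAX_REDUCERS = 1009;

    static int estimateReducers(long totalInputBytes) {
        // Round up so each reducer stays under the per-reducer byte budget;
        // underestimating here is what lets a single shuffle partition grow
        // past limits such as the 2 GB netty frame size.
        long reducers = (totalInputBytes + BYTES_PER_REDUCER - 1) / BYTES_PER_REDUCER;
        return (int) Math.max(1, Math.min(reducers, MAX_REDUCERS));
    }

    public static void main(String[] args) {
        // 10 GB of estimated shuffle input -> 40 reducers under these defaults.
        System.out.println(estimateReducers(10L * 1024 * 1024 * 1024));
    }
}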



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
