[ https://issues.apache.org/jira/browse/HIVE-8649?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Xuefu Zhang updated HIVE-8649:
------------------------------
    Fix Version/s:     (was: spark-branch)
                       1.1.0

> Increase level of parallelism in reduce phase [Spark Branch]
> ------------------------------------------------------------
>
>                 Key: HIVE-8649
>                 URL: https://issues.apache.org/jira/browse/HIVE-8649
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Brock Noland
>            Assignee: Jimmy Xiang
>             Fix For: 1.1.0
>
>         Attachments: HIVE-8649.1-spark.patch, HIVE-8649.2-spark.patch
>
>
> We calculate the number of reducers using the same code as MapReduce.
> However, reducers are vastly cheaper in Spark, and it is generally recommended
> to run many more reducers than in MR.
> Sandy Ryza, who works on Spark, has some ideas about a heuristic.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
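For context, the MapReduce-derived estimate the issue refers to can be sketched roughly as below. This is a minimal illustration, not Hive's actual implementation; the default values mirror the `hive.exec.reducers.bytes.per.reducer` (256 MB in recent releases) and `hive.exec.reducers.max` (1009) settings, but treat both the defaults and the function shape as assumptions.

```python
import math

def estimate_reducers(total_input_bytes,
                      bytes_per_reducer=256 * 1024 * 1024,  # assumed default, cf. hive.exec.reducers.bytes.per.reducer
                      max_reducers=1009):                   # assumed default, cf. hive.exec.reducers.max
    """Simplified sketch of the MR-style reducer-count heuristic:
    one reducer per bytes_per_reducer of input, capped at max_reducers."""
    return max(1, min(max_reducers,
                      math.ceil(total_input_bytes / bytes_per_reducer)))
```

The point of the issue is that because Spark reduce tasks are much cheaper to launch than MR reduce tasks, a Spark-specific heuristic would want a smaller `bytes_per_reducer` (or an extra multiplier), yielding many more reducers for the same input size.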