[ https://issues.apache.org/jira/browse/HIVE-14029?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15544407#comment-15544407 ]
Lefty Leverenz commented on HIVE-14029:
---------------------------------------

[~Ferd] and [~lirui], yes, we should add a section on the Spark versions that are compatible with different Hive releases, and include as much information as possible.

* [Hive on Spark: Getting Started | https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started]

I was about to add such a section at the beginning of the doc (before Spark Installation) but hesitated because I don't know what version(s) can be used with the installation instructions.

> Update Spark version to 2.0.0
> -----------------------------
>
>                 Key: HIVE-14029
>                 URL: https://issues.apache.org/jira/browse/HIVE-14029
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Ferdinand Xu
>            Assignee: Ferdinand Xu
>              Labels: Incompatible, TODOC2.2
>             Fix For: 2.2.0
>
>         Attachments: HIVE-14029.1.patch, HIVE-14029.2.patch, HIVE-14029.3.patch, HIVE-14029.4.patch, HIVE-14029.5.patch, HIVE-14029.6.patch, HIVE-14029.7.patch, HIVE-14029.8.patch, HIVE-14029.patch
>
>
> There are quite a few new optimizations in Spark 2.0.0. We need to bump Spark up to 2.0.0 to benefit from those performance improvements.
> To update the Spark version to 2.0.0, the following changes are required:
> * Spark API updates:
> ** SparkShuffler#call returns Iterator instead of Iterable
> ** SparkListener -> JavaSparkListener
> ** InputMetrics constructor doesn't accept readMethod
> ** Methods remoteBlocksFetched and localBlocksFetched in ShuffleReadMetrics return long instead of int
> * Dependency upgrades:
> ** Jackson: 2.4.2 -> 2.6.5
> ** Netty: 4.0.23.Final -> 4.0.29.Final
> ** Scala binary version: 2.10 -> 2.11
> ** Scala version: 2.10.4 -> 2.11.8

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
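A minimal sketch of the first API update above, the SparkShuffler#call return-type change from Iterable to Iterator. The types below (ShufflerSketch, callOld, callNew) are simplified stand-ins, not the actual Hive or Spark classes; the point is only that a method that previously built an Iterable can be adapted to the new signature by returning that Iterable's iterator:

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

public class ShufflerSketch {
    // Hypothetical Spark 1.x-era shape: call(...) produced an Iterable.
    static Iterable<String> callOld(List<String> rows) {
        return rows; // List is already an Iterable
    }

    // Spark 2.0-era shape: call(...) must return an Iterator instead.
    // Existing Iterable-producing logic is reused and adapted at the end.
    static Iterator<String> callNew(List<String> rows) {
        return callOld(rows).iterator();
    }

    public static void main(String[] args) {
        Iterator<String> it = callNew(Arrays.asList("a", "b"));
        StringBuilder sb = new StringBuilder();
        while (it.hasNext()) {
            sb.append(it.next());
        }
        System.out.println(sb); // prints "ab"
    }
}
```

Note the trade-off this change implies: an Iterator can only be consumed once, so any caller that previously iterated the returned Iterable multiple times has to be restructured as part of the upgrade.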