[ https://issues.apache.org/jira/browse/HIVE-14029?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15583393#comment-15583393 ]
Sergio Peña commented on HIVE-14029:
------------------------------------

Interesting, so even between Spark 1.x versions, Hive wasn't compatible with them at all? This is going to be a lot of work, as you said. If Spark 2.1 isn't compatible with Spark 2.0, for instance, then we will need a shim layer with minor changes per Spark version to keep compatibility.

[~xuefuz] Were there users in the community complaining about Spark 1.x incompatibilities with Hive in the past?

> Update Spark version to 2.0.0
> -----------------------------
>
>                 Key: HIVE-14029
>                 URL: https://issues.apache.org/jira/browse/HIVE-14029
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Ferdinand Xu
>            Assignee: Ferdinand Xu
>              Labels: Incompatible, TODOC2.2
>             Fix For: 2.2.0
>
>         Attachments: HIVE-14029.1.patch, HIVE-14029.2.patch, HIVE-14029.3.patch, HIVE-14029.4.patch, HIVE-14029.5.patch, HIVE-14029.6.patch, HIVE-14029.7.patch, HIVE-14029.8.patch, HIVE-14029.patch
>
>
> There are quite a few new optimizations in Spark 2.0.0. We need to bump Spark up to 2.0.0 to benefit from those performance improvements.
> To update the Spark version to 2.0.0, the following changes are required:
> * Spark API updates:
> ** SparkShuffler#call returns Iterator instead of Iterable
> ** SparkListener -> JavaSparkListener
> ** InputMetrics constructor doesn't accept readMethod
> ** Methods remoteBlocksFetched and localBlocksFetched in ShuffleReadMetrics return long instead of int
> * Dependency upgrades:
> ** Jackson: 2.4.2 -> 2.6.5
> ** Netty: 4.0.23.Final -> 4.0.29.Final
> ** Scala binary version: 2.10 -> 2.11
> ** Scala version: 2.10.4 -> 2.11.8

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
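To illustrate the first API update above (Iterable -> Iterator in the callback contract), here is a minimal, self-contained sketch of the adaptation pattern such a migration typically needs. The class and method names below are illustrative only, not Hive's actual `SparkShuffler` shim code; the example assumes only that a Spark 1.x-style callback produced an `Iterable<T>` and the Spark 2.0-style contract wants an `Iterator<T>`:

```java
import java.util.Arrays;
import java.util.Iterator;

// Hypothetical adapter illustrating the Spark 1.x -> 2.0 callback change:
// old-style call() returned Iterable<T>; the new contract returns Iterator<T>.
public class IteratorAdapter {
    // Spark 1.x-style result (illustrative stand-in for the old callback).
    static Iterable<String> legacyCall(String row) {
        return Arrays.asList(row.split(","));
    }

    // Spark 2.0-style wrapper: same logic, but exposes an Iterator.
    static Iterator<String> call(String row) {
        return legacyCall(row).iterator();
    }

    public static void main(String[] args) {
        Iterator<String> it = call("a,b,c");
        StringBuilder sb = new StringBuilder();
        while (it.hasNext()) {
            sb.append(it.next());
        }
        System.out.println(sb); // abc
    }
}
```

Since `Iterable` already exposes `iterator()`, wrapping old implementations this way is usually the low-risk direction; going the other way (Iterator back to Iterable) is what a per-version shim layer, as discussed in the comment, would have to handle carefully.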