[ https://issues.apache.org/jira/browse/SPARK-17495?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17059844#comment-17059844 ]
Takeshi Yamamuro commented on SPARK-17495:
------------------------------------------

I checked the commit logs:
{code:java}
$ git checkout v2.1.0
$ git log --pretty=oneline | grep SPARK-17495
a99743d053e84f695dc3034550939555297b0a05 [SPARK-17495][SQL] Add Hash capability semantically equivalent to Hive's

$ git checkout v2.2.0
$ git log --pretty=oneline | grep SPARK-17495
9456688547522a62f1e7520e9b3564550c57aa5d [SPARK-17495][SQL] Support date, timestamp and interval types in Hive hash
2a0bc867a4a1dad4ecac47701199e540d345ff4f [SPARK-17495][SQL] Support Decimal type in Hive-hash
3e40f6c3d6fc0bcd828d09031fa3994925394889 [SPARK-17495][SQL] Add more tests for hive hash
a99743d053e84f695dc3034550939555297b0a05 [SPARK-17495][SQL] Add Hash capability semantically equivalent to Hive's
{code}
I think the essential parts needed to fix this issue were merged into branch-2.2 (the v2.2.0 release), so I will close this again with `Fix Version/s`=2.2.0. If there is any problem with this, please fix it.

[~dongjoon] [~hyukjin.kwon] [~viirya] @Tejas Patil If you have more work to do on this, please open a new JIRA.

[~FelixKJose] Yes, if you use Spark 2.4.x, you are already using this fix.

> Hive hash implementation
> ------------------------
>
>         Key: SPARK-17495
>         URL: https://issues.apache.org/jira/browse/SPARK-17495
>     Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>    Reporter: Tejas Patil
>    Assignee: Tejas Patil
>    Priority: Minor
>      Labels: bulk-closed
>
> Spark internally uses Murmur3Hash for partitioning. This is different from
> the one used by Hive. For queries which use bucketing, this leads to different
> results if one tries the same query on both engines. We want users to have
> backward compatibility so that one can switch parts of applications across
> the engines without observing regressions.
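To illustrate the mismatch described above: Spark's default partitioning hash is Murmur3, while Hive's bucketing hash for string columns follows the Java `String.hashCode` scheme (`h = 31*h + c`, wrapped to signed 32 bits), which is what the HiveHash work in the commits listed here emulates. The sketch below is illustrative only (the function name is my own, not a Spark or Hive API) and assumes BMP-only strings, where Python code points match Java's UTF-16 units:

```python
def java_string_hashcode(s: str) -> int:
    """Java-style String.hashCode: h = 31*h + c over the characters,
    wrapped to a signed 32-bit integer. This is the scheme Hive's
    bucketing hash uses for strings, unlike Spark's default Murmur3."""
    h = 0
    for c in s:
        h = (31 * h + ord(c)) & 0xFFFFFFFF  # keep the low 32 bits
    # Reinterpret as a signed 32-bit integer, as Java would.
    return h - 0x100000000 if h >= 0x80000000 else h

# The same string hashes differently under Murmur3 and this scheme,
# which is why Hive and Spark buckets did not line up before this fix.
print(java_string_hashcode("abc"))  # 96354, matching "abc".hashCode() in Java
```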
--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org