Sergey Shelukhin updated HIVE-12880:
------------------------------------
    Attachment: HIVE-12880.patch

The patch. spark-assembly will be added to the classpath if either spark_home is set by the user, or if we find spark_home AND the user has explicitly set the skip flag to false. If nothing is set explicitly but we find Spark, a warning is emitted. [~xuefuz] does this make sense? (A sketch of this rule follows below the quoted description.)

> spark-assembly causes Hive class version problems
> -------------------------------------------------
>
>                 Key: HIVE-12880
>                 URL: https://issues.apache.org/jira/browse/HIVE-12880
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Hui Zheng
>            Assignee: Sergey Shelukhin
>         Attachments: HIVE-12880.patch
>
>
> It looks like spark-assembly contains versions of Hive classes (e.g. HiveConf), and these sometimes (always?) come from older versions of Hive. We've seen problems where, depending on classpath perturbations, a NoSuchFieldError may be thrown for recently added ConfVars because the HiveConf class comes from spark-assembly.
> Would making sure spark-assembly comes last in the classpath solve the problem? Otherwise, can we depend on something that does not package older Hive classes?
> Currently, HIVE-12179 provides a workaround (in the non-Spark use case, at least; I am assuming this issue can also affect Hive-on-Spark).
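To make the proposed rule concrete, here is a minimal sketch of the decision logic from the comment above. It is illustrative only: the method and flag names are hypothetical and not taken from HIVE-12880.patch, and the actual change would most likely live in Hive's launch scripts rather than in Java code.

{code:java}
import java.io.File;
import java.util.Optional;

// Illustrative sketch of the rule described in the comment; all names
// here (the skip flag, the helper methods) are hypothetical.
public class SparkAssemblyRule {

  /** Returns the spark-assembly jar to append to the classpath, if any. */
  static Optional<File> sparkAssemblyJar(String userSparkHome,
                                         Boolean userSkipFlag,
                                         String detectedSparkHome) {
    if (userSparkHome != null) {
      // Case 1: the user explicitly set spark_home => always add the jar.
      return findAssembly(userSparkHome);
    }
    if (detectedSparkHome != null && Boolean.FALSE.equals(userSkipFlag)) {
      // Case 2: Spark was auto-detected AND the user explicitly set the
      // skip flag to false => add the jar.
      return findAssembly(detectedSparkHome);
    }
    if (detectedSparkHome != null) {
      // Spark found, but nothing was set explicitly => warn, do not add.
      System.err.println("Warning: Spark found at " + detectedSparkHome
          + " but spark-assembly was not added; set spark_home or the skip"
          + " flag explicitly.");
    }
    return Optional.empty();
  }

  static Optional<File> findAssembly(String sparkHome) {
    // In Spark 1.x distributions, spark-assembly-*.jar lives under lib/.
    File[] jars = new File(sparkHome, "lib").listFiles(
        (dir, name) -> name.startsWith("spark-assembly-") && name.endsWith(".jar"));
    return (jars != null && jars.length > 0) ? Optional.of(jars[0]) : Optional.empty();
  }
}
{code}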
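As for the quoted description: a quick way to check whether this shadowing is happening in a given deployment is to ask the JVM where it loaded HiveConf from. A small diagnostic sketch (the HiveConf class name is real; the rest is illustrative and assumes HiveConf is on the classpath):

{code:java}
// Prints the jar that HiveConf was actually loaded from. If this points
// at spark-assembly-*.jar rather than hive-common-*.jar, the shadowing
// described above is in effect.
public class WhichHiveConf {
  public static void main(String[] args) throws Exception {
    Class<?> c = Class.forName("org.apache.hadoop.hive.conf.HiveConf");
    // getCodeSource() is non-null for classes loaded from a jar on the classpath.
    System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
  }
}
{code}

Running this with the same classpath the Hive CLI uses makes it easy to tell whether reordering spark-assembly to the end of the classpath actually changes which HiveConf wins.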