[ https://issues.apache.org/jira/browse/SPARK-4852?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14561518#comment-14561518 ]

Yin Huai commented on SPARK-4852:
---------------------------------

[~lian cheng] What is the plan for this? Update Kryo, or ask developers to 
manually copy the Kryo 2.21 jar (per the instructions in 
https://github.com/apache/spark/tree/master/sql)?

> Hive query plan deserialization failure caused by shaded hive-exec jar file when generating golden answers
> -----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-4852
>                 URL: https://issues.apache.org/jira/browse/SPARK-4852
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.0
>            Reporter: Cheng Lian
>            Priority: Minor
>
> When adding Hive 0.13.1 support for the Spark SQL Thrift server in PR 
> [2685|https://github.com/apache/spark/pull/2685], the Kryo 2.22 classes used 
> by the original hive-exec-0.13.1.jar were shaded over with Kryo 2.21, the 
> version Spark SQL depends on, because of dependency conflicts. Unfortunately, 
> Kryo 2.21 has a known bug that can cause Hive query plan deserialization 
> failures; the bug was fixed in Kryo 2.22.
>
> Normally this issue doesn't affect Spark SQL, because we never generate Hive 
> query plans. But when running Hive test suites like 
> {{HiveCompatibilitySuite}}, golden answer files must be generated by Hive 
> itself, which triggers the issue. A workaround is to replace 
> {{hive-exec-0.13.1.jar}} under {{$HIVE_HOME/lib}} with Spark's 
> {{hive-exec-0.13.1a.jar}} and {{kryo-2.21.jar}} (both found under 
> {{$SPARK_DEV_HOME/lib_managed/jars}}), and then add {{$HIVE_HOME/lib}} to 
> {{$HADOOP_CLASSPATH}}.
>
> Upgrading to a newer version of Kryo that is binary compatible with Kryo 
> 2.22 (if one exists) may fix this issue.


