[ https://issues.apache.org/jira/browse/SPARK-12807?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15130884#comment-15130884 ]

Marcelo Vanzin commented on SPARK-12807:
----------------------------------------

- Guava is shaded in the final jar
- Netty is a different version (4.x for Spark vs. 3.x for Hadoop), but it lives 
in a different package, so fortunately there's no conflict
- leveldb was intentionally kept at the same version as Hadoop, exactly because 
of the JNI bits
- the javax annotations are not used at runtime, so there's no need to shade them

That leaves only Jackson; in hindsight it would have been better not to include 
it in the shuffle service, but that's water under the bridge now. So there are 
two options:

- pin it to the same version as Hadoop, assuming the different Hadoop versions 
people run all ship the same Jackson version...
- relocate it

Relocation is probably the safest choice given the many versions of Hadoop 
people may be using.
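As a rough sketch of what relocation could look like with the Maven shade plugin (this is illustrative, not Spark's actual build config; the `org.spark_project.jackson` prefix is a hypothetical example):

```xml
<!-- Hypothetical maven-shade-plugin relocation for Jackson.
     The shadedPattern prefix is illustrative only. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <relocation>
        <pattern>com.fasterxml.jackson</pattern>
        <shadedPattern>org.spark_project.jackson</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
    </execution>
  </executions>
</plugin>
```

With the bytecode rewritten under a shaded package, the shuffle service's copy of Jackson can no longer collide with whatever Jackson version the NodeManager already has on its classpath.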

> Spark External Shuffle not working in Hadoop clusters with Jackson 2.2.3
> ------------------------------------------------------------------------
>
>                 Key: SPARK-12807
>                 URL: https://issues.apache.org/jira/browse/SPARK-12807
>             Project: Spark
>          Issue Type: Bug
>          Components: Shuffle, YARN
>    Affects Versions: 1.6.0
>         Environment: A Hadoop cluster with Jackson 2.2.3, spark running with 
> dynamic allocation enabled
>            Reporter: Steve Loughran
>            Priority: Critical
>
> When you try to use dynamic allocation on a Hadoop 2.6-based cluster, you 
> see a stack trace in the NM logs indicating a Jackson 2.x version mismatch.
> (Reported on the Spark dev list.)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
