[ https://issues.apache.org/jira/browse/SPARK-12807?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15134412#comment-15134412 ]

Steve Loughran commented on SPARK-12807:
----------------------------------------

It's good to hear that things work with older versions, but they do need to be 
compiled in sync. The risk of downgrading Jackson versions is that anyone who 
has already upgraded will find their code no longer links. This is the same 
dilemma HADOOP-10104 created: revert, or tell others "sorry, time to 
upgrade". We went with the latter, but have added Jackson to the list of 
dependencies whose upgrades are traumatic: Guava, protobuf (which will never be 
upgraded on the 2.x line).
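
A mismatch like this usually shows up as two different Jackson versions sitting on the same classpath. A minimal shell sketch of how one might spot that (the directory and jar names below are illustrative stand-ins, not taken from the report; on a real cluster you would point this at the NodeManager's classpath directories):

```shell
# Set up a fake classpath directory to demonstrate the check
# (illustrative names only; 2.2.3 is Hadoop 2.6's Jackson, 2.4.4 is Spark 1.6's)
mkdir -p /tmp/nm-classpath-demo
touch /tmp/nm-classpath-demo/jackson-databind-2.2.3.jar \
      /tmp/nm-classpath-demo/jackson-core-2.4.4.jar

# Extract the distinct version strings from the jar names.
# More than one line of output means mixed Jackson versions, i.e. a clash.
ls /tmp/nm-classpath-demo/jackson-*.jar \
  | sed -E 's/.*-([0-9]+\.[0-9]+\.[0-9]+)\.jar/\1/' | sort -u
# prints:
# 2.2.3
# 2.4.4
```

On a Maven-built project, `mvn dependency:tree` gives the same information from the build side.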

> Spark External Shuffle not working in Hadoop clusters with Jackson 2.2.3
> ------------------------------------------------------------------------
>
>                 Key: SPARK-12807
>                 URL: https://issues.apache.org/jira/browse/SPARK-12807
>             Project: Spark
>          Issue Type: Bug
>          Components: Shuffle, YARN
>    Affects Versions: 1.6.0
>         Environment: A Hadoop cluster with Jackson 2.2.3, spark running with 
> dynamic allocation enabled
>            Reporter: Steve Loughran
>            Priority: Critical
>
> When you try to use dynamic allocation on a Hadoop 2.6-based cluster, 
> the NodeManager logs show a stack trace indicating a Jackson 2.x version 
> mismatch.
> (reported on the spark dev list)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
