[ https://issues.apache.org/jira/browse/SPARK-21143?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16054592#comment-16054592 ]

Shixiong Zhu commented on SPARK-21143:
--------------------------------------

Since Netty is so core to Spark, upgrading from 4.0.x to 4.1.x is too risky. 
There is no plan to do so right now.

> Fail to fetch blocks >1MB in size in presence of conflicting Netty version
> --------------------------------------------------------------------------
>
>                 Key: SPARK-21143
>                 URL: https://issues.apache.org/jira/browse/SPARK-21143
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.1.1
>            Reporter: Ryan Williams
>            Priority: Minor
>
> One of my Spark libraries inherited a transitive dependency on Netty 
> 4.1.6.Final (vs. Spark's 4.0.42.Final), and I observed a strange failure I 
> wanted to document: fetches of blocks larger than 1MB (pre-compression, 
> afaict) seem to trigger a code path that results in {{AbstractMethodError}}s 
> and, ultimately, stage failures.
> I put a minimal repro in [this github 
> repo|https://github.com/ryan-williams/spark-bugs/tree/netty]: {{collect}} on 
> a 1-partition RDD with 1032 {{Array\[Byte\]}}s of size 1000 works, but with 
> 1033 {{Array}}s it dies in a confusing way.
> I'm not sure what fixing/mitigating this in Spark would look like, other than 
> defensively shading and renaming Netty.
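
For reference, a minimal sketch of the kind of repro described above, assuming Spark 2.1.x with a conflicting Netty 4.1.x jar on the classpath. The object and app names are illustrative; the linked repo is the authoritative repro, and the ~1MB threshold is my reading of the report, not something re-verified here.

{code:scala}
// Sketch only: submit against a cluster (master/deploy settings via spark-submit),
// with a Netty 4.1.x jar shadowing Spark's 4.0.42.Final on the classpath.
import org.apache.spark.{SparkConf, SparkContext}

object NettyConflictRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("netty-conflict-repro"))

    def collectArrays(n: Int): Int = {
      // One partition holding n byte arrays of 1000 bytes each (~n KB serialized).
      val rdd = sc.parallelize(Seq.fill(n)(Array.fill[Byte](1000)(0.toByte)), numSlices = 1)
      rdd.collect().length
    }

    println(s"1032 arrays -> ${collectArrays(1032)} collected")  // works per the report
    println(s"1033 arrays -> ${collectArrays(1033)} collected")  // dies per the report, once the
                                                                 // result crosses ~1MB and is
                                                                 // fetched as a block
    sc.stop()
  }
}
{code}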
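As for the defensive shading/renaming idea: a rough sketch of the application-side equivalent with sbt-assembly might look like the following (the relocated package name is arbitrary; doing this inside Spark itself would instead be a change to Spark's own build).

{code:scala}
// build.sbt fragment (sbt-assembly), sketch only: relocate the app's Netty classes
// so they cannot shadow Spark's Netty 4.0.42.Final. Target package name is arbitrary.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("io.netty.**" -> "my.shaded.io.netty.@1").inAll
)
{code}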


