Hi all,

The upcoming Apache Arrow v13.0.0 release, due this month[1], upgrades Netty to v4.1.94.Final[2] to address a moderate-severity CVE[3]. We are seeing that Spark, which is on Netty v4.1.93.Final, is not compatible with Arrow v13.0.0 and throws an exception at runtime[4]. There has been some discussion in a Spark PR about upgrading to Netty v4.1.94.Final once the new arrow-memory-netty is released[5].
Should the Spark POM be updated to shade arrow-memory-netty?

Thanks,
Dane

[1] https://lists.apache.org/thread/f9r0dsd65ohdtcvc7fnnlfs23n3z0n7f
[2] https://github.com/apache/arrow/pull/36211
[3] https://github.com/advisories/GHSA-6mjq-h674-j845
[4] https://github.com/apache/arrow/issues/36332
[5] https://github.com/apache/spark/pull/41681
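P.S. For concreteness, the kind of shading I'm asking about would look roughly like the following in the Spark POM. This is only a sketch: the exact artifact set, the plugin placement, and the org.sparkproject relocation prefix are my assumptions, modeled on how Spark already shades other dependencies via the Maven Shade Plugin.

```xml
<!-- Sketch only: bundle arrow-memory-netty together with its Netty
     dependencies and relocate Netty so Arrow's copy cannot clash with
     the Netty version Spark itself ships. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <artifactSet>
      <includes>
        <include>org.apache.arrow:arrow-memory-netty</include>
        <!-- Netty artifacts must be included too, or relocation
             leaves dangling references to unshaded classes. -->
        <include>io.netty:*</include>
      </includes>
    </artifactSet>
    <relocations>
      <relocation>
        <pattern>io.netty</pattern>
        <!-- Prefix assumed from Spark's existing shading convention. -->
        <shadedPattern>org.sparkproject.arrow.io.netty</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>
```

Whether this is preferable to simply bumping Spark's own Netty to v4.1.94.Final is exactly what I'm hoping to get input on.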