[ https://issues.apache.org/jira/browse/SPARK-33279?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17224420#comment-17224420 ]

Bryan Cutler commented on SPARK-33279:
--------------------------------------

[~fan_li_ya] we should change the Arrow-Spark integration tests so that they 
don't try to build with the latest Arrow Java, and instead just test against the 
latest pyarrow, which should work.

> Spark 3.0 failure due to lack of Arrow dependency
> -------------------------------------------------
>
>                 Key: SPARK-33279
>                 URL: https://issues.apache.org/jira/browse/SPARK-33279
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Liya Fan
>            Priority: Major
>
> A recent change in Arrow has split the arrow-memory module into three modules, so 
> client code must now add an explicit dependency on arrow-memory-netty (or 
> arrow-memory-unsafe), as sketched below.
> This has already been done on Spark's master branch, but not on branch-3.0, which 
> is causing the branch-3.0 build to fail 
> (https://github.com/ursa-labs/crossbow/actions?query=branch:actions-681-github-test-conda-python-3.7-spark-branch-3.0)
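
The dependency the description refers to would look roughly like the following sbt 
declaration. This is only a sketch: the Arrow version shown (2.0.0, where the 
arrow-memory split landed) and the use of sbt rather than Spark's Maven build are 
assumptions; the coordinates should match whatever Arrow version the branch pins.

    // build.sbt sketch: add the Netty-backed allocator module that the
    // arrow-memory split moved out of the core artifact (version is assumed)
    libraryDependencies ++= Seq(
      "org.apache.arrow" % "arrow-vector"       % "2.0.0",
      "org.apache.arrow" % "arrow-memory-netty" % "2.0.0"  // or "arrow-memory-unsafe"
    )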


