[ https://issues.apache.org/jira/browse/SPARK-33279?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17223009#comment-17223009 ]

Dongjoon Hyun commented on SPARK-33279:
---------------------------------------

Hi, [~fan_li_ya]. In general, Apache Spark 3.0.0 was released before Arrow 2.0,
and `branch-3.0` only accepts bug fixes. Arrow 2.0 sounds like it brings lots of
new features rather than bug fixes.

In addition, Apache Spark 3.1 is scheduled for January 2021, and Apache Spark
3.0.2 is on a similar timeline. So, I'd recommend using Apache Spark 3.1.

> Spark 3.0 failure due to lack of Arrow dependency
> -------------------------------------------------
>
>                 Key: SPARK-33279
>                 URL: https://issues.apache.org/jira/browse/SPARK-33279
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Liya Fan
>            Priority: Major
>
> A recent change in Arrow has split the arrow-memory module into 3 modules, so
> client code must add a dependency on arrow-memory-netty (or
> arrow-memory-unsafe). This has been done in the master branch of Spark, but
> not in branch-3.0, which is causing the branch-3.0 build to fail
> (https://github.com/ursa-labs/crossbow/actions?query=branch:actions-681-github-test-conda-python-3.7-spark-branch-3.0)
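For reference, the workaround described in the report amounts to declaring the
split-out allocator module explicitly in the build. A minimal sketch for a Maven
build follows; the version and scope shown are illustrative and should match the
Arrow version already on the classpath:

```xml
<!-- arrow-memory was split into arrow-memory-core plus pluggable allocator
     backends; one backend (netty or unsafe) must now be declared explicitly. -->
<dependency>
  <groupId>org.apache.arrow</groupId>
  <artifactId>arrow-memory-netty</artifactId>
  <!-- illustrative version; align with the arrow-vector version in use -->
  <version>2.0.0</version>
  <scope>runtime</scope>
</dependency>
```

Substituting arrow-memory-unsafe for arrow-memory-netty selects the
sun.misc.Unsafe-based allocator instead.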


