[ 
https://issues.apache.org/jira/browse/ARROW-11450?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17278338#comment-17278338
 ] 

Joris Van den Bossche commented on ARROW-11450:
-----------------------------------------------

bq.  We're trying to keep our version spec as wide as possible, and testing 
against pyarrow 1.x, 2.x, (and soon 3.x) so our users and downstream libraries 
don't need to upgrade in lockstep with us. I think we'll just have to keep a 
numpy <1.20.0 bound until we can drop 1.x, 2.x support.

Generally speaking (not knowing Beam specifically here), I think it should 
still be possible to support this wide range. For testing, you can pin numpy 
< 1.20 in the builds that test pyarrow 1.x and 2.x, and leave numpy unpinned 
in the build that tests pyarrow 3.x, without needing to require numpy < 1.20 
generally. (At least, that's what we do for pandas.)
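As a minimal sketch of that split-pin approach (a hypothetical CI matrix for illustration — the job layout and file names are not Beam's actual setup), the numpy cap can live only in the legacy-pyarrow test builds while the package's own install requirements stay wide:

```yaml
# Hypothetical GitHub Actions matrix (illustrative, not Beam's real CI):
# cap numpy only where the older pyarrow wheels need it.
strategy:
  matrix:
    include:
      - pyarrow: ">=1,<2"
        numpy: "<1.20"   # pyarrow 1.x breaks with numpy >= 1.20
      - pyarrow: ">=2,<3"
        numpy: "<1.20"   # same cap for pyarrow 2.x
      - pyarrow: ">=3,<4"
        numpy: ""        # no cap: 3.x is built against the newer numpy
steps:
  - run: pip install "pyarrow${{ matrix.pyarrow }}" "numpy${{ matrix.numpy }}"
  - run: pytest
```

The key point is that only the test environments carry the numpy < 1.20 pin; the package metadata itself does not, so users on pyarrow 3.x are free to use numpy 1.20+.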

> [Python] pyarrow<3 incompatible with numpy>=1.20.0
> --------------------------------------------------
>
>                 Key: ARROW-11450
>                 URL: https://issues.apache.org/jira/browse/ARROW-11450
>             Project: Apache Arrow
>          Issue Type: Bug
>         Environment: python>=3.7
> Debian 5.7 x86_64
>            Reporter: Jongbin Park
>            Priority: Major
>             Fix For: 3.0.0
>
>
> pyarrow 1.0 and 2.0 are not compatible with numpy 1.20.0.
> Running the following command fails:
> {{pa.array(np.arange(10))}}
> with error
> {{pyarrow.lib.ArrowTypeError: Did not pass numpy.dtype object}}
> The numpy release notes [link|#compatibility-notes] mention the np.dtype 
> related compatibility breaks. There is also a C API change, which implies 
> the numpy dependency constraint should be tighter (whether <=1.20 or >1.20) 
> depending on the numpy version pyarrow was compiled against (if pyarrow 
> depends on it; I'm not familiar with pyarrow's internals).



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
