[ https://issues.apache.org/jira/browse/ARROW-11450?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17278357#comment-17278357 ]

Joris Van den Bossche commented on ARROW-11450:
-----------------------------------------------

Indeed, doing such pinning in your own CI of course doesn't prevent users from 
bumping into this incompatible version combo. Unfortunately, I am not sure it is 
easy to fix with version constraints (retroactively). Even if you did a bugfix 
release now to lower the numpy bound, a solver would probably find the older 
version of beam that doesn't have this bound when it wants to install beam 
together with numpy 1.20, in which case you can still hit the same issue. 
(A new beam (bugfix) release to remove the pyarrow < 3.0 bound might be more 
effective, though.) 
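To illustrate the pinning mentioned above: a downstream project that cannot yet upgrade could constrain one side of the combo in its own requirements file (a sketch only, not an official recommendation; either line by itself avoids the broken pairing):

```
# workaround sketch for a pip-style requirements file:
# either move past the incompatibility...
pyarrow>=3.0
# ...or hold numpy below the release with the C API change while on pyarrow<3
numpy<1.20
```

Note this only protects environments that actually install from such a file; as discussed above, it does not help a solver-driven install of beam plus numpy 1.20.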

> [Python] pyarrow<3 incompatible with numpy>=1.20.0
> --------------------------------------------------
>
>                 Key: ARROW-11450
>                 URL: https://issues.apache.org/jira/browse/ARROW-11450
>             Project: Apache Arrow
>          Issue Type: Bug
>         Environment: python>=3.7
> Debian 5.7 x86_64
>            Reporter: Jongbin Park
>            Priority: Major
>             Fix For: 3.0.0
>
>
> pyarrow 1.0 and 2.0 are not compatible with numpy 1.20.0
> Running the following command would fail:
> {{pa.array(np.arange(10))}}
> with error
> {{pyarrow.lib.ArrowTypeError: Did not pass numpy.dtype object}}
> The numpy release notes [link|#compatibility-notes] mention the np.dtype-related 
> compatibility breaks. There is also a C API change, which implies the numpy 
> dependency constraint should be tighter (whether <=1.20 or >1.20) depending on 
> the numpy version pyarrow was compiled against (if pyarrow depends on it; I'm 
> not familiar with pyarrow internals).
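The broken/fixed combinations described in this report can be sketched as a small version check (illustrative only; `is_compatible` is a hypothetical helper, not part of pyarrow or numpy):

```python
def is_compatible(pyarrow_version: str, numpy_version: str) -> bool:
    """Return False for the combo reported here: pyarrow < 3.0 with numpy >= 1.20."""
    pa_major = int(pyarrow_version.split(".")[0])
    np_pair = tuple(int(p) for p in numpy_version.split(".")[:2])
    # pyarrow 1.x and 2.x wheels predate the np.dtype / C API changes in numpy 1.20
    return not (pa_major < 3 and np_pair >= (1, 20))

print(is_compatible("2.0.0", "1.20.0"))  # the failing combo from this report
print(is_compatible("3.0.0", "1.20.0"))  # the pairing the 3.0.0 fix allows
```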



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
