[ https://issues.apache.org/jira/browse/ARROW-11450?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17278331#comment-17278331 ]

Brian Hulette commented on ARROW-11450:
---------------------------------------

You're right: in our older releases we should have anticipated an issue like 
this and made our numpy bound <1.20.0, and if it becomes a problem we could 
*also* make a bugfix release to lower the bound (although our release process 
has the same problem).

I'm actually more concerned about what we're going to do for future releases. 
We're trying to keep our version spec as wide as possible and testing against 
pyarrow 1.x, 2.x (and soon 3.x), so our users and downstream libraries don't 
need to upgrade in lockstep with us. I think we'll just have to keep a numpy 
<1.20.0 bound until we can drop 1.x and 2.x support. Experienced users could 
always override it.
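
For illustration only, a minimal sketch of what such a constraint could look 
like in a setuptools-based project; the package name, version, and exact 
specifiers below are assumptions, not the project's actual setup.py:

{code:python}
# Hypothetical install_requires sketch: keep the pyarrow range wide while
# capping numpy below 1.20.0 until pyarrow 1.x/2.x support can be dropped.
# All names and version specifiers here are illustrative assumptions.
from setuptools import setup

setup(
    name="example-downstream-package",  # placeholder project name
    version="0.0.1",
    install_requires=[
        "pyarrow>=1.0.0,<4.0.0",  # wide pyarrow range: 1.x, 2.x, and 3.x
        "numpy>=1.14.3,<1.20.0",  # upper bound avoids the 1.20.0 dtype break
    ],
)
{code}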

> [Python] pyarrow<3 incompatible with numpy>=1.20.0
> --------------------------------------------------
>
>                 Key: ARROW-11450
>                 URL: https://issues.apache.org/jira/browse/ARROW-11450
>             Project: Apache Arrow
>          Issue Type: Bug
>         Environment: python>=3.7
> Debian 5.7 x86_64
>            Reporter: Jongbin Park
>            Priority: Major
>             Fix For: 3.0.0
>
>
> pyarrow 1.0 and 2.0 are not compatible with numpy 1.20.0.
> Running the following command would fail:
> {{pa.array(np.arange(10))}}
> with error
> {{pyarrow.lib.ArrowTypeError: Did not pass numpy.dtype object}}
> The numpy release notes ([link|#compatibility-notes]) mention the np.dtype-related 
> compatibility breaks. There is also a C API change, which implies the numpy 
> dependency constraint should be tighter (whether <=1.20 or >1.20) depending on 
> the numpy version pyarrow was compiled against (if pyarrow depends on it; I'm 
> not familiar with pyarrow's internals).
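
For reference, a minimal reproduction sketch of the failure described above; it 
assumes an environment with pyarrow<3 and numpy>=1.20.0 installed, and the 
exact error text may vary between versions:

{code:python}
# Reproduction sketch for the reported incompatibility.
# With pyarrow>=3.0.0 (or numpy<1.20.0) the conversion is expected to succeed.
import numpy as np
import pyarrow as pa

try:
    arr = pa.array(np.arange(10))
    print(arr.type)  # on a compatible combination this prints: int64
except pa.lib.ArrowTypeError as exc:
    # On pyarrow 1.x/2.x with numpy>=1.20.0 this reportedly raises:
    # "Did not pass numpy.dtype object"
    print(f"Incompatible pyarrow/numpy combination: {exc}")
{code}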



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
