[ https://issues.apache.org/jira/browse/ARROW-11147?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17259856#comment-17259856 ]

Joris Van den Bossche commented on ARROW-11147:
-----------------------------------------------

Maybe just pin to 0.25.3 for both master/latest, so the "if" check can be
removed. Then afterwards we can update it to just use the latest pandas,
since the reason stated there for the 0.25.3 pin ("The Spark tests currently
break with pandas >= 1.0") has presumably been fixed by now.
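
The pinning suggested above could be sketched as follows. This is a minimal, hypothetical snippet, not the actual Arrow CI configuration; the environment setup commands and the timing of relaxing the pin are assumptions.

```shell
# Hypothetical CI environment setup: pin pandas to 0.25.3 unconditionally
# for both the master and latest builds, so no branch-dependent "if" check
# is needed to select the pandas version.
pip install "pandas==0.25.3"

# Later, once the Spark breakage with pandas >= 1.0 is confirmed fixed,
# the pin can be dropped to track the latest release:
# pip install pandas
```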

> Parquet tests failing in nightly build with Dask master
> -------------------------------------------------------
>
>                 Key: ARROW-11147
>                 URL: https://issues.apache.org/jira/browse/ARROW-11147
>             Project: Apache Arrow
>          Issue Type: Bug
>            Reporter: Andrew Wieteska
>            Priority: Major
>
>  
> FAILED opt/conda/envs/arrow/lib/python3.8/site-packages/dask/dataframe/io/tests/test_parquet.py::test_categorical[pyarrow-dataset-pyarrow-legacy]
> FAILED opt/conda/envs/arrow/lib/python3.8/site-packages/dask/dataframe/io/tests/test_parquet.py::test_categorical[pyarrow-legacy-pyarrow-dataset]
> FAILED opt/conda/envs/arrow/lib/python3.8/site-packages/dask/dataframe/io/tests/test_parquet.py::test_categorical[pyarrow-legacy-pyarrow-legacy]
> FAILED opt/conda/envs/arrow/lib/python3.8/site-packages/dask/dataframe/io/tests/test_parquet.py::test_categorical[pyarrow-dataset-pyarrow-dataset]
> = 4 failed, 365 passed, 290 skipped, 18 deselected, 2 xfailed, 25 warnings in 43.84s
> cc [~jorisvandenbossche]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)