legout commented on issue #34363:
URL: https://github.com/apache/arrow/issues/34363#issuecomment-1448435389

   I think it is somehow related to R2, because I am able to run this script (https://github.com/apache/arrow/issues/34363#issue-1601167816) against other S3 object stores (tested with Wasabi, Contabo S3, IDrive e2, Storj, AWS S3, and self-hosted MinIO) without any problems.
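   
   For reference, a minimal sketch of how such a script can be pointed at a different store with pyarrow (credentials and endpoint are placeholders, not the values from my script; only `endpoint_override` changes between providers):
   
   ```python
   import pyarrow as pa
   import pyarrow.fs as pafs
   import pyarrow.parquet as pq
   
   # Placeholder credentials/endpoint (assumptions, not from the linked script);
   # swapping endpoint_override is all it takes to target R2, Wasabi, MinIO, etc.
   s3 = pafs.S3FileSystem(
       access_key="<access-key>",
       secret_key="<secret-key>",
       endpoint_override="https://<account-id>.r2.cloudflarestorage.com",
   )
   
   table = pa.table({"x": list(range(10))})
   pq.write_table(table, "test/test.parquet", filesystem=s3)
   ```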
   
   One more piece of information: I am even able to upload `large_data.parquet` using `s3fs` with `fs1.put_file("large_data.parquet", "test/test.parquet")`, and the upload also works with the AWS CLI (which presumably also uses [the SDK](https://aws.amazon.com/sdk-for-cpp/)?).
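   
   For completeness, the working `s3fs` call, assuming `fs1` was created roughly like this (the R2 endpoint and keys are placeholders):
   
   ```python
   import s3fs
   
   # Placeholder R2 credentials/endpoint (assumptions, not my real config).
   fs1 = s3fs.S3FileSystem(
       key="<access-key>",
       secret="<secret-key>",
       client_kwargs={"endpoint_url": "https://<account-id>.r2.cloudflarestorage.com"},
   )
   
   # Uploads the local file to the bucket; this succeeds where pyarrow fails.
   fs1.put_file("large_data.parquet", "test/test.parquet")
   ```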
   
   I understand that this issue cannot be solved within Arrow, so we can probably close it here.
   However, I'd like to find out what causes this error. Is it possible to run pyarrow commands in a "debugging mode" to get more details?
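   
   One thing I came across that might be such a "debugging mode": `pyarrow.fs.initialize_s3` accepts an S3 log level. A minimal sketch (assuming it also applies when talking to R2; credentials and endpoint are placeholders):
   
   ```python
   import pyarrow.fs as pafs
   from pyarrow.fs import S3LogLevel
   
   # Must run before the first S3 filesystem is created in the process;
   # Debug/Trace print the underlying AWS SDK activity to stderr.
   pafs.initialize_s3(S3LogLevel.Trace)
   
   # Placeholder credentials/endpoint; reproduce the failing upload afterwards.
   s3 = pafs.S3FileSystem(
       access_key="<access-key>",
       secret_key="<secret-key>",
       endpoint_override="https://<account-id>.r2.cloudflarestorage.com",
   )
   ```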
   
   

