AlenkaF commented on code in PR #48619:
URL: https://github.com/apache/arrow/pull/48619#discussion_r2685603434
##########
docs/source/python/parquet.rst:
##########

@@ -405,11 +498,11 @@ individual table writes are wrapped using ``with`` statements so the

 .. code-block:: python

-    # Remote file-system example
-    from pyarrow.fs import HadoopFileSystem
-    fs = HadoopFileSystem(host, port, user=user, kerb_ticket=ticket_cache_path)
-    pq.write_to_dataset(table, root_path='dataset_name',
-                        partition_cols=['one', 'two'], filesystem=fs)
+    >>> # Remote file-system example
+    >>> from pyarrow.fs import HadoopFileSystem

Review Comment:
   HDFS (and S3) support should be built in (wheels example: https://github.com/apache/arrow/blob/main/ci/scripts/python_wheel_xlinux_build.sh#L57, conda: https://github.com/apache/arrow/blob/main/ci/scripts/python_wheel_xlinux_build.sh#L57). That being said, since we are skipping all the following lines anyway (this is not really a good reproducible example), it makes sense to skip this line and the following ones too.

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
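For context, the diff under review converts a plain ``code-block`` into doctest style, and the reviewer suggests skipping the HDFS lines since no live cluster is available when the docs are tested. A sketch of what the fully converted snippet might look like, assuming the standard ``# doctest: +SKIP`` directive is used (the names ``host``, ``port``, ``user``, ``ticket_cache_path``, and ``table`` are placeholders carried over from the original example):

```rst
>>> # Remote file-system example (skipped: requires a running HDFS cluster)
>>> from pyarrow.fs import HadoopFileSystem  # doctest: +SKIP
>>> fs = HadoopFileSystem(host, port, user=user,
...                       kerb_ticket=ticket_cache_path)  # doctest: +SKIP
>>> pq.write_to_dataset(table, root_path='dataset_name',
...                     partition_cols=['one', 'two'],
...                     filesystem=fs)  # doctest: +SKIP
```

With every line marked ``+SKIP``, the snippet still renders as an interactive-session example in the built docs but is never executed by the doctest runner.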
