mesejo commented on issue #442:
URL:
https://github.com/apache/arrow-datafusion-python/issues/442#issuecomment-1685809731
Hey! If all the files are under the same bucket, you could do the following:
```python
import os

import datafusion
from datafusion.object_store import AmazonS3

region = "us-east-1"
bucket_name = "yellow-trips"

# Credentials are read from the environment rather than hard-coded.
s3 = AmazonS3(
    bucket_name=bucket_name,
    region=region,
    access_key_id=os.getenv("AWS_ACCESS_KEY_ID"),
    secret_access_key=os.getenv("AWS_SECRET_ACCESS_KEY"),
)

ctx = datafusion.SessionContext()

# Register the object store for the bucket, then register the Parquet
# data that lives under it as a table named "trips".
path = f"s3://{bucket_name}/"
ctx.register_object_store(path, s3)
ctx.register_parquet("trips", path)

df = ctx.sql("select count(passenger_count) from trips")
df.show()
```
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]