This is an automated email from the ASF dual-hosted git repository.
adutra pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/polaris.git
The following commit(s) were added to refs/heads/main by this push:
new a75229c7d Add missing region to MinIO getting-started example (#2411)
a75229c7d is described below
commit a75229c7db367cc0bb941680ecc43326a51fdd6d
Author: Alexandre Dutra <[email protected]>
AuthorDate: Wed Aug 20 17:03:36 2025 +0200
Add missing region to MinIO getting-started example (#2411)
The example was missing an AWS region, causing Spark to fail with:
```
spark-sql ()> create table ns.t1 as select 'abc';
25/08/20 16:25:06 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
software.amazon.awssdk.core.exception.SdkClientException: Unable to load
region from any of the providers in the chain
software.amazon.awssdk.regions.providers.DefaultAwsRegionProviderChain@47578c86:
[software.amazon.awssdk.regions.providers.SystemSettingsRegionProvider@1656f847:
Unable to load region from system settings. Region must be specified either
via environment variable (AWS_REGION) or system property (aws.region).,
software.amazon.awssdk.regions.providers.AwsProfileRegionPr [...]
...
at org.apache.iceberg.aws.AwsClientFactories$DefaultAwsClientFactory.s3(AwsClientFactories.java:119)
at org.apache.iceberg.aws.s3.S3FileIO.client(S3FileIO.java:391)
at org.apache.iceberg.aws.s3.S3FileIO.newOutputFile(S3FileIO.java:193)
```
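The patch below fixes this by passing a region through the Polaris catalog configuration. As the exception message indicates, the AWS SDK's region provider chain will also accept an environment variable; a minimal sketch of that alternative (not the approach taken by this patch, and using an arbitrary placeholder value, since MinIO does not require a specific region) would be:
```
# Alternative workaround suggested by the exception message, not what this
# patch does: satisfy the AWS SDK region provider chain via the environment
# before launching the shell. The value is an arbitrary placeholder; MinIO
# ignores the region.
export AWS_REGION=us-east-1
bin/spark-sql ...
```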
---
getting-started/minio/README.md | 6 +++++-
1 file changed, 5 insertions(+), 1 deletion(-)
diff --git a/getting-started/minio/README.md b/getting-started/minio/README.md
index 5b4271458..65293c21b 100644
--- a/getting-started/minio/README.md
+++ b/getting-started/minio/README.md
@@ -56,11 +56,15 @@ bin/spark-sql \
--conf spark.sql.catalog.polaris.warehouse=quickstart_catalog \
--conf spark.sql.catalog.polaris.scope=PRINCIPAL_ROLE:ALL \
--conf spark.sql.catalog.polaris.header.X-Iceberg-Access-Delegation=vended-credentials \
- --conf spark.sql.catalog.polaris.credential=root:s3cr3t
+ --conf spark.sql.catalog.polaris.credential=root:s3cr3t \
+ --conf spark.sql.catalog.polaris.client.region=irrelevant
```
Note: `s3cr3t` is defined as the password for the `root` user in the
`docker-compose.yml` file.
+Note: The `client.region` configuration is required for the AWS S3 client to work, but it is not used in this example
+since MinIO does not require a specific region.
+
## Running Queries
Run inside the Spark SQL shell:
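For example, the statement from the error log above (a minimal check, not part of the original README excerpt) should now create the table instead of failing with the missing-region error once `client.region` is set:
```
-- Previously failed with SdkClientException ("Unable to load region ...").
-- With spark.sql.catalog.polaris.client.region configured, it should succeed.
create table ns.t1 as select 'abc';
```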