> :1.8.0_171]
> at com.amazonaws.internal.ConnectionUtils.connectToEndpoint(ConnectionUtils.java:52) ~[blob_p-575afa7acc2fe3049b65534303a189df3afe9895-6c71352c89388f6a3754b9b72482e6d2:?]
> at com.amazonaws.internal.EC2ResourceFetcher.doReadResource(EC2ResourceFetcher.java:80) ~[blob_p-575afa7acc2fe3
>
>>
>> Good luck :)
>>
>> Svend
>>
>> [1] https://hadoop.apache.org/docs/current/hadoop-aws/tools/hadoop-aws/index.html
>> [2] https://hadoop.apache.org/docs/current/hadoop-aws/tools/hadoop-aws/assumed_roles.html
Hello,
Trying to read a Parquet file located in S3 leads to an AWS credentials
exception. Switching to another format (raw, for example) works fine as
far as file access is concerned.
This is a snippet of code to reproduce the issue:

static void parquetS3Error() {
    EnvironmentSettings settings =
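
The snippet is cut off above; a minimal sketch of a job of this shape,
assuming the Flink 1.12 Table API with the filesystem connector (the
bucket path, field name, and class name are placeholders):

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class ParquetS3Repro {

    static void parquetS3Error() {
        // Blink planner in batch mode, as in the Flink 1.12 examples.
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inBatchMode()
                .build();
        TableEnvironment tEnv = TableEnvironment.create(settings);

        // Reading this table raises the AWS credentials exception;
        // with 'format' = 'raw' the same path is readable.
        tEnv.executeSql(
            "CREATE TABLE parquet_in (" +
            "  f0 STRING" +
            ") WITH (" +
            "  'connector' = 'filesystem'," +
            "  'path' = 's3a://my-bucket/some/path'," +
            "  'format' = 'parquet'" +
            ")");

        tEnv.executeSql("SELECT * FROM parquet_in").print();
    }
}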
> th the configuration, and found that the exception
> is thrown if there is no s3a.access-key or s3a.secret-key
> configured. Could you check whether the two configuration items
> are taking effect?
>
> Also, I only configured s3a.path-style: true, s3a.access-key and
> s3a.secret-key, is
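
For reference, a minimal flink-conf.yaml for a non-AWS, path-style S3
endpoint might look like the following sketch (the endpoint and
credentials are placeholders; Flink forwards its s3.* keys to the
bundled Hadoop S3A filesystem):

# flink-conf.yaml -- placeholder values
s3.endpoint: https://my-s3-host:9000
s3.path.style.access: true
s3.access-key: MY_ACCESS_KEY
s3.secret-key: MY_SECRET_KEY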
Hi,
I'm trying to read from and write to S3 with Flink 1.12.2. I'm submitting
the job to a local cluster (tar.gz distribution). I do not have a Hadoop
installation running on the same machine. S3 (not Amazon) is running in a
remote location, and I have access to it via an endpoint and access/secret
keys.
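
For anyone reproducing this setup, the usual way to enable S3 support on
the plain tar.gz distribution is to load the S3 filesystem as a plugin
before starting the cluster. A sketch, assuming the Flink 1.12.2
distribution layout:

# run from the unpacked Flink 1.12.2 directory
mkdir -p plugins/s3-fs-hadoop
cp opt/flink-s3-fs-hadoop-1.12.2.jar plugins/s3-fs-hadoop/
./bin/start-cluster.sh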