Hi,

I'm new to both AWS and Flink, and I currently need to write incoming data into an S3 bucket accessed via AWS temporary (STS) credentials.
I am unable to get this to work, and I am not entirely sure of the steps needed. I can write fine to S3 buckets that are not 'remote' and managed by STS temporary credentials. I am using Flink 1.9.1, as this is the version the job will run on when deployed to EMR. My flink-conf.yml contains the following entries:

```yaml
fs.s3a.bucket.sky-rdk-telemetry.aws.credentials.provider: >
  org.apache.flink.fs.shaded.hadoop3.org.apache.hadoop.fs.s3a.auth.AssumedRoleCredentialProvider
fs.s3a.bucket.sky-rdk-telemetry.assumed.role.credentials.provider: org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider
fs.s3a.bucket.sky-rdk-telemetry.access-key: xxxxx
fs.s3a.bucket.sky-rdk-telemetry.secret-key: xxxx
fs.s3a.bucket.sky-rdk-telemetry.assumed.role.arn: xxxx
fs.s3a.bucket.sky-rdk-telemetry.assumed.role.session.name: xxxx
```

And my POM contains:

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-java-sdk-bom</artifactId>
      <version>1.11.700</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependency>
  <groupId>com.amazonaws</groupId>
  <artifactId>aws-java-sdk-sts</artifactId>
  <version>1.11.700</version>
</dependency>
```

I have put the jar flink-s3-fs-hadoop-1.9.1.jar into the plugins directory.

Running my test jar, I get exceptions related to class not found for `org/apache/flink/fs/s3base/shaded/com/amazonaws/services/securitytoken/model/AWSSecurityTokenServiceException`, and poking around I see this class is shaded into a package in the Kinesis connector. I have added some rules to maven-shade to rewrite the package as needed, but this still doesn't help.

Am I heading in the correct direction? Searching has not turned up much information that I have been able to make use of.

Thanks for your time,
J
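For reference, the maven-shade relocation rules I added look roughly like the following. The relocation pattern is my guess at mapping the AWS SDK's STS classes into the shaded package named in the exception, so please treat it as a sketch of what I tried rather than a known-good configuration:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <!-- Guess: relocate the AWS SDK STS classes to the package
               path that appears in the ClassNotFoundException -->
          <relocation>
            <pattern>com.amazonaws.services.securitytoken</pattern>
            <shadedPattern>org.apache.flink.fs.s3base.shaded.com.amazonaws.services.securitytoken</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```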