Hi all,

I am trying out Spark 3.2.1 on Kubernetes with Hadoop 3.3.1 and running into
issues writing to an S3 bucket using the TemporaryAWSCredentialsProvider:
https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/index.html#Using_Session_Credentials_with_TemporaryAWSCredentialsProvider

While reading from S3 works, I am getting a 403 Access Denied error when
writing to the KMS-enabled bucket.

I am wondering if I am missing some dependency JARs or client configuration
properties.
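
For context, the setup I am describing boils down to something like the
following (a minimal sketch, not my exact job: the bucket name, region, and
KMS key ARN are placeholders, the credentials are pulled from environment
variables only for illustration, and I am not certain the SSE-KMS properties
shown are the right ones to set):

import org.apache.spark.sql.SparkSession

// Session credentials via TemporaryAWSCredentialsProvider, plus SSE-KMS
// settings for the target bucket. Bucket name and key ARN are placeholders.
val spark = SparkSession.builder()
  .appName("s3a-kms-write-test")
  .config("spark.hadoop.fs.s3a.aws.credentials.provider",
    "org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider")
  .config("spark.hadoop.fs.s3a.access.key", sys.env("AWS_ACCESS_KEY_ID"))
  .config("spark.hadoop.fs.s3a.secret.key", sys.env("AWS_SECRET_ACCESS_KEY"))
  .config("spark.hadoop.fs.s3a.session.token", sys.env("AWS_SESSION_TOKEN"))
  // My understanding is that with SSE-KMS the session credentials also need
  // kms:GenerateDataKey / kms:Decrypt on the key, or writes can fail with 403.
  .config("spark.hadoop.fs.s3a.server-side-encryption-algorithm", "SSE-KMS")
  .config("spark.hadoop.fs.s3a.server-side-encryption.key",
    "arn:aws:kms:us-east-1:111111111111:key/placeholder-key-id")
  .getOrCreate()

// Reading from the bucket works; a write like this is what returns 403.
spark.range(10).write.mode("overwrite").parquet("s3a://my-kms-bucket/test-write/")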
I would appreciate it if someone could give me a few pointers on this.

Regards,
Prasad Paravatha
