No, it doesn't. You can either set the credentials on your
SparkContext's Hadoop configuration:

  // Propagate the S3 credentials (s3n scheme) to Hadoop's FileSystem layer
  val hadoopConf = sparkContext.hadoopConfiguration
  hadoopConf.set("fs.s3n.awsAccessKeyId", s3Key)
  hadoopConf.set("fs.s3n.awsSecretAccessKey", s3Secret)


or you can set them as environment variables in the shell that
launches Spark:

export AWS_ACCESS_KEY_ID=<your access>
export AWS_SECRET_ACCESS_KEY=<your secret>
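
If you'd rather keep everything in Spark properties, anything prefixed
with spark.hadoop. gets copied into the Hadoop configuration as well
(a sketch; s3Key/s3Secret are the same placeholders as above):

  import org.apache.spark.{SparkConf, SparkContext}

  val conf = new SparkConf()
    .set("spark.hadoop.fs.s3n.awsAccessKeyId", s3Key)
    .set("spark.hadoop.fs.s3n.awsSecretAccessKey", s3Secret)
  val sc = new SparkContext(conf)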


Thanks
Best Regards

On Mon, Sep 21, 2015 at 9:04 PM, Michel Lemay <mle...@gmail.com> wrote:

> Hi,
>
> It looks like spark does not read AWS credentials from the
> AWS_CREDENTIAL_FILE environment variable the way awscli does.
>
>
> Mike
>
>
