Ah, alright then, it looks like that's the case. Thank you for the info.
I'm probably going to try the S3-managed encryption (SSE-S3) instead; from
what I read, this is supported by setting the
fs.s3a.server-side-encryption-algorithm parameter.
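
In case it helps anyone else, here's roughly what I'm planning to try. A
minimal, untested sketch, assuming a SparkContext sc and a DataFrame df;
the bucket/path are placeholders, and "AES256" is the value the s3a docs
give for S3-managed keys:

  // configure the s3a connector for SSE-S3 before writing
  sc.hadoopConfiguration.set(
    "fs.s3a.server-side-encryption-algorithm", "AES256")

  // then write as usual; objects should land encrypted with S3-managed keys
  df.write.parquet("s3a://my-bucket/output")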

Thanks!
Nisrina

On Tue, Jan 26, 2016 at 11:55 PM, Ewan Leith <ewan.le...@realitymine.com>
wrote:

> Hi Nisrina, I’m not aware of any support for KMS keys in s3n, s3a, or the
> EMR-specific EMRFS S3 driver.
>
> If you’re using EMRFS with Amazon’s EMR, you can use KMS keys with
> client-side encryption:
>
> http://docs.aws.amazon.com/kms/latest/developerguide/services-emr.html#emrfs-encrypt
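>
> Roughly, that's driven by the emrfs-site classification when the cluster
> is created. A sketch from memory, so verify the property names
> fs.s3.cse.enabled and fs.s3.cse.kms.keyId against the page above; the
> key id is a placeholder:
>
>   [
>     {
>       "Classification": "emrfs-site",
>       "Properties": {
>         "fs.s3.cse.enabled": "true",
>         "fs.s3.cse.kms.keyId": "<your-kms-key-id>"
>       }
>     }
>   ]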
>
> If this has changed, I’d love to know, but I’m pretty sure it hasn’t.
>
> The alternative is to write to HDFS, then copy the data across in bulk.
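>
> Something like this, as a sketch (paths are placeholders):
>
>   // write the job output to HDFS on the cluster first
>   df.write.parquet("hdfs:///tmp/staging/output")
>
>   // then copy across in bulk from the shell, e.g.
>   //   hadoop distcp hdfs:///tmp/staging/output s3a://my-bucket/output
>   // or with s3-dist-cp on EMR, so the copy goes through EMRFS and picks
>   // up the cluster's encryption settings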
>
> Thanks,
>
> Ewan
>
> *From:* Nisrina Luthfiyati [mailto:nisrina.luthfiy...@gmail.com]
> *Sent:* 26 January 2016 10:42
> *To:* user <user@spark.apache.org>
> *Subject:* Write to S3 with server side encryption in KMS mode
>
> Hi all,
>
> I'm trying to save a Spark application's output to a bucket in S3. The
> data is supposed to be encrypted with S3's server-side encryption in KMS
> mode, which typically (using the Java API or the CLI) requires passing
> the SSE-KMS key when writing the data. I haven't yet found a way to do
> this through Spark's Hadoop config. Would anyone have an idea how this
> can be done, or whether it's possible at all?
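>
> For reference, with the AWS CLI that looks something like this (the
> bucket and key id are placeholders):
>
>   aws s3 cp part-00000 s3://my-bucket/output/ \
>     --sse aws:kms --sse-kms-key-id <my-key-id>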
>
> Thanks,
>
> Nisrina.

-- 
Nisrina Luthfiyati - Ilmu Komputer Fasilkom UI 2010
http://www.facebook.com/nisrina.luthfiyati
http://id.linkedin.com/in/nisrina
