rajdchak opened a new pull request, #7738:
URL: https://github.com/apache/hadoop/pull/7738

   <!--
     Thanks for sending a pull request!
       1. If this is your first time, please read our contributor guidelines: 
https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute
       2. Make sure your PR title starts with JIRA issue id, e.g., 
'HADOOP-17799. Your PR title ...'.
   -->
   
   ### Description of PR
   
   This PR integrates S3A with the SSE-C support introduced in the Analytics Accelerator library.
   
   
   ### How was this patch tested?
   Ran all the integration tests in hadoop-aws on an EC2 instance.

   Setup:
   Created a personal bucket in the us-west-2 region and ran the hadoop-aws integration tests with the configuration below, which mainly enables the analytics input stream and SSE-C encryption.
   ```
   <configuration>
       <property>
           <name>test.fs.s3a.name</name>
           <value>s3a://rajdchak-s3a-integration/</value>
       </property>
   
       <property>
           <name>fs.contract.test.fs.s3a</name>
           <value>${test.fs.s3a.name}</value>
       </property>
   
       <property>
           <name>fs.s3a.endpoint.region</name>
           <value>us-west-2</value>
       </property>
   
       <property>
           <name>fs.s3a.access.key</name>
           <description>AWS access key ID. Omit for IAM role-based 
authentication.</description>
           <value> </value>
       </property>
   
       <property>
           <name>fs.s3a.secret.key</name>
           <description>AWS secret key. Omit for IAM role-based 
authentication.</description>
           <value> </value>
       </property>
   
       <property>
           <name>fs.s3a.session.token</name>
           <value>
              
           </value>
       </property>
   
       <property>
           <name>fs.s3a.aws.credentials.provider</name>
           <value>
               org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider
           </value>
       </property>
   
       <property>
           <name>fs.s3a.scale.test.enabled</name>
           <value>true</value>
       </property>
   
       <property>
           <name>fs.s3a.input.stream.type</name>
           <value>analytics</value>
       </property>
   
       <property>
           <name>fs.s3a.encryption.algorithm</name>
           <description>Specify a server-side encryption or client-side
               encryption algorithm for s3a: file system. Unset by default. It 
supports the
               following values: 'AES256' (for SSE-S3), 'SSE-KMS', 'SSE-C', and 
'CSE-KMS'
           </description>
           <value>SSE-C</value>
       </property>
   
       <property>
           <name>fs.s3a.encryption.key</name>
           <description>Specific encryption key to use if 
fs.s3a.encryption.algorithm
               has been set to 'SSE-KMS', 'SSE-C' or 'CSE-KMS'. In the case of 
SSE-C
               , the value of this property should be the Base64 encoded key. 
If you are
               using SSE-KMS and leave this property empty, you'll be using 
your default's
               S3 KMS key, otherwise you should set this property to the 
specific KMS key
               id. In case of 'CSE-KMS' this value needs to be the AWS-KMS Key 
ID
               generated from AWS console.
           </description>
           <value>AO8XKQXJgtIS9G+IrSWZ2eSNW1yJlvqElVoVcNlvDqE=</value>
       </property>
   
   
   </configuration>
   ```
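
   For reference, a minimal Java sketch of the equivalent programmatic setup (the bucket, object path and encryption key below are placeholders, not values from this PR):

   ```java
    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class AnalyticsSseCReadExample {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Select the Analytics Accelerator input stream.
        conf.set("fs.s3a.input.stream.type", "analytics");
        // Enable SSE-C with a Base64-encoded 256-bit key (placeholder value).
        conf.set("fs.s3a.encryption.algorithm", "SSE-C");
        conf.set("fs.s3a.encryption.key", "<base64-encoded-256-bit-key>");
        conf.set("fs.s3a.endpoint.region", "us-west-2");

        // Placeholder bucket; the object must have been written with the same SSE-C key.
        Path path = new Path("s3a://example-bucket/data/sample.csv");
        try (FileSystem fs = FileSystem.get(URI.create("s3a://example-bucket/"), conf);
            FSDataInputStream in = fs.open(path)) {
          byte[] buffer = new byte[8192];
          int read = in.read(buffer);
          System.out.println("Read " + read + " bytes through the analytics stream");
        }
      }
    }
   ```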
   
   However, unrelated to these changes, enabling SSE-C causes multiple tests to fail with a Bad Request exception because they run against public buckets such as ```noaa-cors-pds```. We should take an action item to skip those tests when SSE-C is enabled; https://issues.apache.org/jira/browse/HADOOP-19590 has been filed to track this.
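
   One possible shape for that follow-up, sketched with a hypothetical helper class (the actual change will be tracked under HADOOP-19590):

   ```java
    import org.apache.hadoop.conf.Configuration;
    import org.junit.Assume;

    /**
     * Hypothetical guard for tests that read public datasets such as
     * noaa-cors-pds: skip them when the test configuration enables SSE-C,
     * since public objects cannot be read with a customer-supplied key.
     */
    public final class SseCTestGuard {
      private SseCTestGuard() {
      }

      public static void skipIfSseCEnabled(Configuration conf) {
        String algorithm = conf.getTrimmed("fs.s3a.encryption.algorithm", "");
        Assume.assumeFalse("Skipping public-bucket test: SSE-C is enabled",
            "SSE-C".equalsIgnoreCase(algorithm));
      }
    }
   ```

   Each affected test could then call this from its setup before touching the public bucket.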
   
   ### For code changes:
   
   - [x] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
   - [x] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   

