Hi,

We were running some Hive queries over data on Amazon S3. In the Hive
script file we include the access key and secret access key as below:

set fs.s3.awsAccessKeyId=ABCD;
set fs.s3.awsSecretAccessKey=XYZ;
set fs.s3n.awsAccessKeyId=ABCD;
set fs.s3n.awsSecretAccessKey=XYZ;

and then we run the script using hive -f <<scriptfile>>
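(For reference, we could equally pass the keys on the command line instead of inside the script; a sketch of that invocation, with placeholder key values and a hypothetical script name:)

```shell
# Pass the S3 credentials per-invocation via -hiveconf instead of
# 'set' statements inside the script (ABCD/XYZ are placeholders):
hive -hiveconf fs.s3n.awsAccessKeyId=ABCD \
     -hiveconf fs.s3n.awsSecretAccessKey=XYZ \
     -f myscript.hql
```

This behaves the same way for us: the properties are only set in the client session.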

Everything was working fine until we changed our Hive configuration to use
a remote metastore (
http://www.cloudera.com/content/cloudera-content/cloudera-docs/CDH4/4.2.0/CDH4-Installation-Guide/cdh4ig_topic_18_4.html)
and also enabled 'Bypass Hive Metastore Server' (
http://blog.cloudera.com/blog/2013/03/how-to-set-up-cloudera-manager-4-5-for-apache-hive/
)

Now when we run the same script we get the following error, even though the
script sets those values:

FAILED: Error in metadata:
MetaException(message:java.lang.IllegalArgumentException: AWS Access Key ID
and Secret Access Key must be specified as the username or password
(respectively) of a s3n URL, or by setting the fs.s3n.awsAccessKeyId or
fs.s3n.awsSecretAccessKey properties (respectively).)
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask


I do not want to set those values in hive-site.xml, as I may need to
point to different S3 buckets with different credentials.
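(The error message itself mentions embedding the credentials in the s3n URL; a sketch of that alternative, with placeholder credentials and a hypothetical bucket/table name:)

```
-- Embed the access key and secret directly in the s3n URL
-- (ABCD/XYZ and my-bucket are placeholders):
CREATE EXTERNAL TABLE my_table (id INT)
LOCATION 's3n://ABCD:XYZ@my-bucket/path/';
```

We have not tried this yet, and it would expose the credentials in the table definition, so it is not our preferred option.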

Am I missing something in the configuration or in the script?

--------------------------
Thanks & Regards
Himanish
