http://www.jets3t.org/toolkit/configuration.html
On Jan 14, 2016 10:56 AM, "Alexander Pivovarov" <apivova...@gmail.com>
wrote:

> Add a jets3t.properties file containing s3service.s3-endpoint=<endpoint> to
> the /etc/hadoop/conf folder
>
> The folder containing the file must be on the HADOOP_CLASSPATH
>
> The JetS3t library, which Hadoop's s3/s3n filesystems use, looks for this
> file on the classpath.
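>
> For example (the endpoint host below is a placeholder; see the JetS3t
> configuration page linked above for the full property list):
>
>   # /etc/hadoop/conf/jets3t.properties
>   s3service.s3-endpoint=objects.example.com
>   s3service.https-only=true
>
> and make sure that folder is on the classpath:
>
>   export HADOOP_CLASSPATH=/etc/hadoop/conf:$HADOOP_CLASSPATH
>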
> On Dec 22, 2015 12:39 PM, "Phillips, Caleb" <caleb.phill...@nrel.gov>
> wrote:
>
>> Hi All,
>>
>> New to this list. Looking for a bit of help:
>>
>> I'm having trouble connecting Hadoop to an S3-compatible (non-AWS) object
>> store.
>>
>> This issue was discussed, but left unresolved, in this thread:
>>
>>
>> https://mail-archives.apache.org/mod_mbox/spark-user/201507.mbox/%3cca+0w_au5es_flugzmgwkkga3jya1asi3u+isjcuymfntvnk...@mail.gmail.com%3E
>>
>> And here, on Cloudera's forums (the second post is mine):
>>
>>
>> https://community.cloudera.com/t5/Data-Ingestion-Integration/fs-s3a-endpoint-ignored-in-hdfs-site-xml/m-p/33694#M1180
>>
>> I'm running Hadoop 2.6.3 with Java 1.8.0_65 on a Linux host. Using
>> Hadoop, I'm able to connect to S3 on AWS, and e.g., list/put/get files.
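>>
>> For example (bucket name is a placeholder), both of these work as
>> expected against AWS:
>>
>>   hadoop fs -ls s3a://my-bucket/
>>   hadoop fs -put local.txt s3a://my-bucket/local.txt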
>>
>> However, when I point the fs.s3a.endpoint configuration directive at my
>> non-AWS S3-compatible object store, it still appears to connect to (and
>> authenticate against) AWS.
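>>
>> The relevant properties, roughly as I have them (the endpoint value is a
>> placeholder for our internal store; these sit alongside the rest of the
>> site configuration, e.g. in core-site.xml):
>>
>>   <property>
>>     <name>fs.s3a.endpoint</name>
>>     <value>objects.example.com</value>
>>   </property>
>>   <property>
>>     <name>fs.s3a.access.key</name>
>>     <value>ACCESS_KEY</value>
>>   </property>
>>   <property>
>>     <name>fs.s3a.secret.key</name>
>>     <value>SECRET_KEY</value>
>>   </property>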
>>
>> I've checked and double-checked my credentials and configuration using
>> both Python's boto library and the s3cmd tool, both of which connect to
>> this non-AWS data store just fine.
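>>
>> For example, a boto check along these lines succeeds (host and
>> credentials are placeholders):
>>
>>   import boto
>>   import boto.s3.connection
>>
>>   # connect to the non-AWS endpoint using path-style addressing
>>   conn = boto.connect_s3(
>>       aws_access_key_id='ACCESS_KEY',
>>       aws_secret_access_key='SECRET_KEY',
>>       host='objects.example.com',
>>       calling_format=boto.s3.connection.OrdinaryCallingFormat())
>>   print(conn.get_all_buckets())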
>>
>> Any help would be much appreciated. Thanks!
>>
>> --
>> Caleb Phillips, Ph.D.
>> Data Scientist | Computational Science Center
>>
>> National Renewable Energy Laboratory (NREL)
>> 15013 Denver West Parkway | Golden, CO 80401
>> 303-275-4297 | caleb.phill...@nrel.gov
>>
