If you don't want to configure hadoop on your Mesos slaves, the only
workaround I see is to write a "hadoop" script and put it in your PATH. It
needs to support the following usage patterns (a rough sketch follows the
list):

- hadoop version
- hadoop fs -copyToLocal s3n://path /target/directory/
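
A minimal sketch of such a wrapper in bash, assuming the AWS CLI is
installed and credentialed on each slave (the s3n:// to s3:// rewrite and
the fake version string are my assumptions, not anything the fetcher
mandates verbatim):

    #!/usr/bin/env bash
    # Minimal "hadoop" stand-in so the Mesos fetcher can pull s3n:// URIs.
    # Sketch only: assumes the AWS CLI is installed and credentials are
    # configured (env vars, instance profile, or ~/.aws/credentials).
    set -euo pipefail

    case "${1:-}" in
      version)
        # The fetcher just needs this invocation to succeed.
        echo "Hadoop wrapper (aws-cli backed)"
        ;;
      fs)
        # Expected form: hadoop fs -copyToLocal s3n://bucket/key /target/dir/
        if [ "${2:-}" = "-copyToLocal" ]; then
          src="s3://${3#s3n://}"   # rewrite s3n:// to s3:// for the AWS CLI
          aws s3 cp "$src" "$4"
        else
          echo "unsupported fs flag: ${2:-}" >&2
          exit 1
        fi
        ;;
      *)
        echo "unsupported command: ${1:-}" >&2
        exit 1
        ;;
    esac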

On Sat, Feb 27, 2016 at 12:31 AM, Aaron Carey <aca...@ilm.com> wrote:

> I was trying to avoid generating URLs for everything, as that would
> complicate things a lot.
>
> Is there a straightforward way to get the fetcher to do it directly?
>
> ------------------------------
> *From:* haosdent [haosd...@gmail.com]
> *Sent:* 26 February 2016 16:27
> *To:* user
> *Subject:* Re: Downloading s3 uris
>
> I think you could still pass the AWSAccessKeyId in the URL if the file is
> private:
> http://www.bucketexplorer.com/documentation/amazon-s3--how-to-generate-url-for-amazon-s3-files.html
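>
> For example, a pre-signed URL can be generated with the AWS CLI (bucket
> and key here are hypothetical):
>
>     aws s3 presign s3://my-bucket/path/to/file --expires-in 3600
>
> The output is a plain http(s) URL, with the access key and signature in
> the query string, that the fetcher can download without credentials.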
>
> On Sat, Feb 27, 2016 at 12:25 AM, Abhishek Amralkar <
> abhishek.amral...@talentica.com> wrote:
>
>> In that case do we need to keep bucket/files public?
>>
>> -Abhishek
>>
>> From: Zhitao Li <zhi...@uber.com>
>> Reply-To: "user@mesos.apache.org" <user@mesos.apache.org>
>> Date: Friday, 26 February 2016 at 8:23 AM
>> To: "user@mesos.apache.org" <user@mesos.apache.org>
>> Subject: Re: Downloading s3 uris
>>
>> I haven't used the s3 download directly, but I think a workaround (if you
>> don't care about the ACLs on the files) is to use an http URL
>> <http://stackoverflow.com/questions/18239567/how-can-i-download-a-file-from-an-s3-bucket-with-wget>
>> instead.
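>>
>> For a public object, the http form would be something like this (bucket
>> name and key hypothetical):
>>
>>     wget https://my-bucket.s3.amazonaws.com/path/to/file
>>
>> which the Mesos fetcher can download without any hadoop client installed.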
>>
>> On Feb 26, 2016, at 8:17 AM, Aaron Carey <aca...@ilm.com> wrote:
>>
>> I'm attempting to fetch files from s3 URIs in Mesos, but we're not using
>> HDFS in our cluster... however I believe I need the Hadoop client
>> installed.
>>
>> Is it possible to just have the client running without a full HDFS setup?
>>
>> I haven't been able to find much information in the docs, could someone
>> point me in the right direction?
>>
>> Thanks!
>>
>> Aaron
>>
>>
>>
>
>
> --
> Best Regards,
> Haosdent Huang
>
