Yeah, I've managed to find the sandbox itself on disk, but it's empty, even 
though the file shows up in the web UI...

My task runs in a docker container and the file doesn't show up in the container either

Any ideas?

Thanks!
Aaron

________________________________
From: Joseph Wu [jos...@mesosphere.io]
Sent: 26 February 2016 18:27
To: user@mesos.apache.org
Subject: Re: Downloading s3 uris

The sandbox directory structure is a bit deep...  See the "Where is the 
sandbox?" section here: http://mesos.apache.org/documentation/latest/sandbox/
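For reference, the path can be reconstructed from the IDs shown in the web UI. A minimal sketch, assuming the default work_dir of /var/lib/mesos (the IDs below are placeholders, not values from this thread):

```shell
#!/bin/sh
# Rebuild the sandbox path from IDs visible in the Mesos web UI.
# All IDs here are made-up examples; substitute your own.
WORK_DIR="/var/lib/mesos"
AGENT_ID="20160226-000000-16777343-5050-1234-S0"
FRAMEWORK_ID="20160226-000000-16777343-5050-1234-0000"
EXECUTOR_ID="my-task"
CONTAINER_ID="7f6a1f3a-0000-0000-0000-000000000000"

SANDBOX="$WORK_DIR/slaves/$AGENT_ID/frameworks/$FRAMEWORK_ID/executors/$EXECUTOR_ID/runs/$CONTAINER_ID"
echo "$SANDBOX"
```

Fetched files land directly in that directory on the host; for docker tasks the same directory is what gets mounted into the container.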


On Fri, Feb 26, 2016 at 10:15 AM, Aaron Carey 
<aca...@ilm.com> wrote:
A second question for you all..

I'm testing http uri downloads, and all the logs say that the file has 
downloaded (it even shows up in the mesos UI in the sandbox) but I can't find 
the file on disk anywhere. It doesn't appear in the docker container I'm 
running either (shouldn't it be in /mnt/mesos/sandbox?)

Am I missing something here?

Thanks for your help,

Aaron


________________________________
From: Radoslaw Gruchalski [ra...@gruchalski.com]
Sent: 26 February 2016 17:41

To: user@mesos.apache.org
Subject: Re: Downloading s3 uris

Just keep in mind that every execution of such command starts a jvm and is, 
generally, heavyweight. Use WebHDFS if you can.

Sent from Outlook Mobile




On Fri, Feb 26, 2016 at 9:13 AM -0800, "Shuai Lin" 
<linshuai2...@gmail.com> wrote:

If you don't want to configure hadoop on your mesos slaves, the only workaround 
I see is to write a "hadoop" script and put it in your PATH. It needs to support 
the following usage patterns:

- hadoop version
- hadoop fs -copyToLocal s3n://path /target/directory/
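A minimal sketch of such a shim, assuming the AWS CLI is available to do the actual copy (any s3-capable tool would work; the argument layout follows the two patterns above):

```shell
#!/bin/sh
# Hypothetical "hadoop" shim for the Mesos fetcher, as a shell function
# for illustration. Only the two invocations the fetcher makes are handled.
hadoop() {
  case "$1" in
    version)
      # The fetcher only probes that the command exists and answers.
      echo "Hadoop 2.7.1"
      ;;
    fs)
      if [ "$2" = "-copyToLocal" ]; then
        # Rewrite s3n:// to s3:// for the AWS CLI, then copy.
        src=$(printf '%s' "$3" | sed 's|^s3n://|s3://|')
        aws s3 cp "$src" "$4"
      fi
      ;;
  esac
}

hadoop version   # -> Hadoop 2.7.1
```

Installed as an executable named `hadoop` on the agent's PATH, this lets the fetcher resolve s3n:// uris without any HDFS setup.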

On Sat, Feb 27, 2016 at 12:31 AM, Aaron Carey 
<aca...@ilm.com> wrote:
I was trying to avoid generating urls for everything as this will complicate 
things a lot.

Is there a straightforward way to get the fetcher to do it directly?

________________________________
From: haosdent [haosd...@gmail.com]
Sent: 26 February 2016 16:27
To: user
Subject: Re: Downloading s3 uris

I think you could still pass AWSAccessKeyId if it is private? 
http://www.bucketexplorer.com/documentation/amazon-s3--how-to-generate-url-for-amazon-s3-files.html
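For illustration, a query-string-authenticated URL of the kind that page describes (the legacy V2 scheme, which carries AWSAccessKeyId in the query string) can be built with nothing but the Python standard library. Treat this as a sketch of the signing scheme, not a vetted signer:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse


def presign_s3_url(bucket, key, access_key, secret_key, expires_in=3600):
    """Build a query-string-authenticated S3 GET URL (legacy V2 signing)."""
    expires = int(time.time()) + expires_in
    # V2 string-to-sign for a plain GET: method, two empty header slots,
    # expiry timestamp, and the resource path.
    string_to_sign = "GET\n\n\n%d\n/%s/%s" % (expires, bucket, key)
    signature = base64.b64encode(
        hmac.new(secret_key.encode(), string_to_sign.encode(), hashlib.sha1).digest()
    ).decode()
    query = urllib.parse.urlencode(
        {"AWSAccessKeyId": access_key, "Expires": expires, "Signature": signature}
    )
    return "https://%s.s3.amazonaws.com/%s?%s" % (bucket, key, query)


# Example with made-up credentials:
url = presign_s3_url("my-bucket", "path/to/file", "AKIAEXAMPLE", "secret")
```

The resulting https url can then be handed to the fetcher like any other http uri, so the bucket can stay private.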

On Sat, Feb 27, 2016 at 12:25 AM, Abhishek Amralkar 
<abhishek.amral...@talentica.com> wrote:
In that case do we need to keep bucket/files public?

-Abhishek

From: Zhitao Li <zhi...@uber.com>
Reply-To: "user@mesos.apache.org" <user@mesos.apache.org>
Date: Friday, 26 February 2016 at 8:23 AM
To: "user@mesos.apache.org" <user@mesos.apache.org>
Subject: Re: Downloading s3 uris

Haven't directly used s3 download, but I think a workaround (if you don't care 
about the files' ACLs) is to use an http url instead (see 
http://stackoverflow.com/questions/18239567/how-can-i-download-a-file-from-an-s3-bucket-with-wget).
On Feb 26, 2016, at 8:17 AM, Aaron Carey 
<aca...@ilm.com> wrote:

I'm attempting to fetch files from s3 uris in mesos, but we're not using hdfs 
in our cluster... however I believe I need the hadoop client installed.

Is it possible to just have the client running without a full hdfs setup?

I haven't been able to find much information in the docs, could someone point 
me in the right direction?

Thanks!

Aaron




--
Best Regards,
Haosdent Huang

