Is there a maximum /tmp tempfile size?

2010-10-06 Thread mdgbayly
I have a background job running as a worker that needs to generate and
write large files to s3.
From what I can tell s3 doesn't support chunked transfer encoding, so I
need to know the size of the file before I can start writing it to s3.
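
(A minimal sketch of the failure mode in Python, using the requests
library; the pre-signed URL and the row generator are placeholders. The
point is that streaming a body of unknown length forces chunked transfer
encoding, which S3's PUT rejects.)

    import requests

    def generate_rows():
        # Placeholder for whatever produces the large payload.
        for i in range(1_000_000):
            yield ("row %d\n" % i).encode()

    # No Content-Length is available for a generator, so requests falls
    # back to Transfer-Encoding: chunked; S3 typically refuses the PUT
    # (historically with a 501 Not Implemented).
    requests.put("https://my-bucket.s3.amazonaws.com/big.csv?signature=...",
                 data=generate_rows())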

One option is to write the file to memory before putting it to s3, but as
these files could be quite big, that could chew up a lot of memory.
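
(A sketch of that in-memory option, assuming the boto3 client and a
hypothetical generate_report helper; the bucket and key names are
placeholders. Buffering into a BytesIO gives you the length up front, but
the whole payload sits in RAM.)

    import io
    import boto3  # modern AWS SDK, used here purely for illustration

    s3 = boto3.client("s3")

    buf = io.BytesIO()
    generate_report(buf)   # hypothetical: writes the large payload to buf
    body = buf.getvalue()  # the entire file is now held in memory
    s3.put_object(Bucket="my-bucket", Key="reports/big.csv", Body=body)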

The other option is to write it to a temp file under my application
root /tmp directory, and then upload the temp file to s3 from there.

I've read the info at http://docs.heroku.com/constraints#read-only-filesystem
and realize that the files won't hang around if my worker is stopped,
restarted, etc., but that's ok. I'd just be generating it, uploading to
s3 and then deleting the local copy.
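
(A minimal sketch of that flow, again assuming boto3 and the hypothetical
generate_report; the "report-" prefix is an assumption too. The finally
block is what keeps /tmp from filling up across runs.)

    import os
    import tempfile
    import boto3  # illustrative; any S3 client that uploads from a path works

    s3 = boto3.client("s3")

    tmp = tempfile.NamedTemporaryFile(prefix="report-", dir="/tmp",
                                      delete=False)
    try:
        generate_report(tmp)   # hypothetical: streams the payload to disk
        tmp.close()
        # upload_file reads the size from disk, so Content-Length is known.
        s3.upload_file(tmp.name, "my-bucket", "reports/big.csv")
    finally:
        tmp.close()            # harmless if already closed
        os.unlink(tmp.name)    # delete the local copy, success or failure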

I've also read that there is a 300MB hard memory cap for dynos. Is that
true for workers too?

Cheers
Martin


Re: Is there a maximum /tmp tempfile size?

2010-10-06 Thread Oren Teich
The file size limit is in the many gigs range.  Clean up after
yourself and you shouldn't have any problems.

Workers are capped at the same memory limit.
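
(One way to honor the "clean up after yourself" advice even after a
crash: sweep your own stale temp files when the worker boots. The
report-* naming pattern and the one-hour cutoff are assumptions.)

    import glob
    import os
    import time

    # On worker startup, remove any of our temp files older than an hour;
    # they can only be leftovers from a run that died before cleanup ran.
    for path in glob.glob("/tmp/report-*"):
        if time.time() - os.path.getmtime(path) > 3600:
            os.unlink(path)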

Oren
