Lio,

there are potentially 2 problems here:
 - the request thread is using too much memory/has a memory leak
 - the size of the file is too large.

i know you are having the second problem; i'm not sure about the first.  
it's possible that the second problem is being reported to you with a 
misleading message.

as for the file size, i am assuming that you are trying to store the file 
directly in a row in the table.  Remember that on GAE, when using the 
datastore, each row is limited to 1MB total, so a single field can only 
get close to 1MB if the row contains nothing else.
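
for example, a common web2py pattern on GAE keeps the file bytes in a blob 
field next to the upload field, so the whole row (bytes included) has to 
fit under 1MB.  a minimal sketch (table and field names are made up, and 
the 900KB cap is just an arbitrary safety margin below the entity limit):

    db.define_table('document',
        Field('name'),
        Field('attachment', 'upload', uploadfield='attachment_data',
              # reject oversize uploads early instead of failing at insert
              requires=IS_LENGTH(maxsize=900*1024)),
        Field('attachment_data', 'blob'))  # file bytes live in the row itself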

To grow past this limitation, look into the GAE blobstore for storing large 
files.  you can also upload the file to some other location (google cloud 
storage, amazon S3) and store only a link to the file in your table.
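
as a rough sketch of the cloud storage route (this uses the app engine GCS 
client library; the bucket name and helper function are my own invention, 
not part of web2py), you write the bytes out to GCS and keep only the short 
path string in the datastore:

    import cloudstorage as gcs

    def save_to_gcs(data, filename, content_type='application/octet-stream'):
        # GCS object paths look like /bucket/object; use your own bucket name
        path = '/my-bucket/' + filename
        with gcs.open(path, 'w', content_type=content_type) as f:
            f.write(data)
        return path  # store this short string in your row, not the bytes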

hope that helps!

cfh

On Tuesday, April 16, 2013 7:25:16 PM UTC-7, Lio wrote:
>
> Hello,
>
> Today, for the first time, I got a 500 Server Error when trying to upload 
> a file of 1.56MB as a field of a record in a table. When checking the GAE 
> logs, it shows "*Exceeded soft private memory limit with 20x.xxx MB after 
> servicing XXX requests total*" every time I tried to upload the same 
> file. 
>
> I've done the same operation several times without any problem, so I guess 
> it's very likely caused by the file size. I reduced the file size to less 
> than 600K and then the upload succeeded. 
>
> My question is: what is the upload file size limit, and in case I really 
> need to upload a big file, what could be the solution?
>
> Thanks for any advice.
