In article <4bea6b50$0$8925$426a7...@news.free.fr>,
News123  <news1...@free.fr> wrote:
>
>I'd like to perform huge file uploads via https.
>I'd like to make sure
>- that I can obtain upload progress info (sometimes the network is
>  very slow)
>- that (if the file exceeds a certain size) I don't have to
>  read the entire file into RAM.

Based on my experience with this, you really need to send multiple
requests (i.e., "chunking").  There are ways around this (you can look
into curl's resumable uploads), but you will need to maintain state no
matter what, and I think chunking is the best and simplest approach.
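
Here's a minimal sketch of the idea, assuming a server that accepts
PUT requests carrying a Content-Range header (the host, path, and
chunk size are placeholders, and the server-side handling is an
assumption -- adapt to whatever protocol your server actually speaks):

import os
import http.client

def upload_in_chunks(host, path, filename,
                     chunk_size=1024 * 1024, start=0):
    """Upload a file as a series of PUT requests, one chunk per
    request.  'start' is the resume offset -- the state you have
    to keep so a failed upload can pick up where it left off.
    Only chunk_size bytes are ever held in RAM at once."""
    total = os.path.getsize(filename)
    offset = start
    with open(filename, "rb") as f:
        f.seek(offset)
        while offset < total:
            chunk = f.read(chunk_size)
            conn = http.client.HTTPSConnection(host)
            conn.putrequest("PUT", path)
            conn.putheader("Content-Range", "bytes %d-%d/%d"
                           % (offset, offset + len(chunk) - 1, total))
            conn.putheader("Content-Length", str(len(chunk)))
            conn.endheaders()
            conn.send(chunk)
            resp = conn.getresponse()
            resp.read()
            conn.close()
            offset += len(chunk)
            # Progress info comes for free: we know exactly how
            # many bytes the server has acknowledged so far.
            print("sent %d of %d bytes (%.1f%%)"
                  % (offset, total, 100.0 * offset / total))
    return offset

If a chunk fails, persist the last good offset somewhere and call the
function again with start=offset -- that's the state-keeping I
mentioned above.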
-- 
Aahz (a...@pythoncraft.com)           <*>         http://www.pythoncraft.com/

f u cn rd ths, u cn gt a gd jb n nx prgrmmng.