Greetings everyone,
I have a piece of code in a web app where I need to store a large binary
file (an uploaded file that the Apache server has written to disk) in an
object's LargeBinary attribute.

That's pretty easy to do with a syntax like:

    myobject.uploaded_file = xyz.file.read()


The problem is that I don't want to load the entire file into memory
when I set the LargeBinary attribute.
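For concreteness, here is a minimal, self-contained version of what I'm doing (the model, table, and names are just illustrative, using an in-memory SQLite database and a BytesIO standing in for the uploaded file):

```python
import io
from sqlalchemy import create_engine, Column, Integer, LargeBinary
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class Upload(Base):
    __tablename__ = "uploads"
    id = Column(Integer, primary_key=True)
    uploaded_file = Column(LargeBinary)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

# stand-in for xyz.file, the file object handed to me by the web framework
fake_upload = io.BytesIO(b"binary payload" * 1000)

with Session(engine) as session:
    obj = Upload()
    # the whole file is read into memory right here, before the INSERT
    obj.uploaded_file = fake_upload.read()
    session.add(obj)
    session.commit()
```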



If my understanding is correct, the call above first loads the entire
content of the uploaded file into memory, and only then assigns it to the
myobject.uploaded_file LargeBinary attribute. Correct?

(SQLAlchemy then eventually issues the INSERT statement to store the
data in the database, but I don't really know how the data transfer is
done at that point...)


I have tried to find another way of passing the data to the LargeBinary
attribute, one that would stream the file in chunks during the INSERT
statement instead of loading it all into memory at once, but I was not
able to find anything. :-(
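The closest I got was reading the file in chunks myself, but as far as I can tell the attribute still has to receive a single bytes object in the end, so this doesn't actually avoid the memory cost (the generator and names below are just an illustration):

```python
import io

def read_in_chunks(fileobj, chunk_size=64 * 1024):
    """Yield successive chunks of the file instead of one big read()."""
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            break
        yield chunk

# Joining the chunks just rebuilds the whole payload in memory before the
# assignment, so nothing is gained for the INSERT itself:
source = io.BytesIO(b"x" * (1024 * 1024))  # stand-in for xyz.file
payload = b"".join(read_in_chunks(source))
```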


Has anyone managed to implement something like this before, or does anyone
know where I can read more about possible ways of doing this with SQLAlchemy?


Thanks a lot,
  Andre



-- 
André Charbonneau
Research Computing Support Analyst
Shared Services Canada | National Research Council Canada
Services partagés Canada | Conseil national de recherches Canada
100 Sussex Drive | 100, promenade Sussex 
Ottawa, Ontario  K1A 0R6
Canada
andre.charbonn...@ssc-spc.gc.ca
Telephone | Téléphone:  613-993-3129
