First off, writing the blob.  From what I gather on the internet, I'm
supposed to read the entire file into memory (as a Python string), create
a dbiRaw object from that string, and then use the dbiRaw object in an
insert statement?
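If I understand it right, the pattern looks something like the sketch below
(I'm using sqlite3 for concreteness since it accepts bytes directly; the
table name and column are just illustrative, and a driver like the one
with dbiRaw would wrap the data differently):

```python
import os
import sqlite3
import tempfile

# Stand-in for the file whose contents we want to store.
path = tempfile.NamedTemporaryFile(delete=False).name
with open(path, "wb") as f:
    f.write(b"x" * 10_000)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE files (name TEXT PRIMARY KEY, data BLOB)")

# The whole-file pattern: slurp everything into one bytes object, then
# hand it to a single INSERT.  Simple, but peak memory use is at least
# the size of the file.
with open(path, "rb") as f:
    data = f.read()
conn.execute("INSERT INTO files VALUES (?, ?)", ("demo", data))
conn.commit()
os.remove(path)
```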

That doesn't sound very efficient to me.  What if my computer only has 64 MB
of memory and the data I want to insert is 128 MB?

It seems like there should be a way to read part of the data, insert that
part into the DB, then get the next part, append, etc.
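Something like this is what I have in mind.  The sketch below does it with
sqlite3, whose || operator happens to concatenate blobs; I don't know
whether other databases or drivers allow appending to a blob column this
way, so treat it as an illustration of the idea rather than a portable
recipe:

```python
import io
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE files (name TEXT PRIMARY KEY, data BLOB)")

def insert_chunked(conn, name, fileobj, chunk_size=64 * 1024):
    """Insert an empty blob, then append one chunk at a time, so only
    chunk_size bytes of the file are ever held in memory."""
    conn.execute("INSERT INTO files VALUES (?, ?)", (name, b""))
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            break
        # In SQLite, || concatenates blobs as well as strings.
        conn.execute("UPDATE files SET data = data || ? WHERE name = ?",
                     (chunk, name))
    conn.commit()

# A 200 KB in-memory "file" stands in for real data on disk.
insert_chunked(conn, "demo", io.BytesIO(b"y" * 200_000))
```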

Also, what if you are inserting a huge piece of data, say a gigabyte, and
you want to show a progress bar in your UI?  How would you do that if you
couldn't split the insert up into smaller statements?
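With a chunked insert, progress reporting falls out naturally: call back
after each chunk.  Again a sqlite3 sketch with made-up names (the report
callback and the blob-append via || are my assumptions, not anything from
a driver's documented API):

```python
import io
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE files (name TEXT PRIMARY KEY, data BLOB)")

def insert_with_progress(conn, name, fileobj, total_size,
                         chunk_size=64 * 1024, report=None):
    """Chunked insert that calls report(bytes_done, total_size) after
    every chunk -- exactly what a UI progress bar needs."""
    conn.execute("INSERT INTO files VALUES (?, ?)", (name, b""))
    done = 0
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            break
        conn.execute("UPDATE files SET data = data || ? WHERE name = ?",
                     (chunk, name))
        done += len(chunk)
        if report:
            report(done, total_size)
    conn.commit()

# Record the progress fractions instead of drawing a real progress bar.
ticks = []
insert_with_progress(conn, "demo", io.BytesIO(b"z" * 150_000), 150_000,
                     report=lambda done, total: ticks.append(done / total))
```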

Thanks for the help.

-- 
http://mail.python.org/mailman/listinfo/python-list
