I have a 113656-byte pickle I'm trying to store in a blob column in a
way that will work for both SQLite and MySQL.  SQLite has no problem
with it, but in MySQL I have to use the MSMediumBlob type because the
value exceeds the 65,535-byte limit of MySQL's BLOB type.  But I'd like
the same table definition to work with both engines.  Is this possible?
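(For what it's worth, one way to keep a single portable table is
TypeEngine.with_variant(), which lets a generic type render differently
per dialect.  A sketch, assuming a reasonably modern SQLAlchemy with the
sqlalchemy.dialects.mysql module; "pickles" and "data" are made-up names:)

```python
import sqlalchemy as sa
from sqlalchemy.dialects import mysql

# Generic LargeBinary everywhere, but rendered as MEDIUMBLOB when the
# table is created against a MySQL connection.
blob_type = sa.types.LargeBinary().with_variant(mysql.MEDIUMBLOB(), "mysql")

metadata = sa.MetaData()
pickles = sa.Table(
    "pickles", metadata,
    sa.Column("id", sa.Integer, primary_key=True),
    sa.Column("data", blob_type),
)
```

SQLite simply sees a BLOB, so the same metadata works unchanged there.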

I'm using a CompressedPickle class that looks like this:

import pickle
import zlib

import sqlalchemy as sa

class CompressedPickle(sa.types.TypeDecorator):
    # LargeBinary, not PickleType, as the impl: a PickleType impl would
    # pickle the already-compressed bytes a second time on the way in.
    impl = sa.types.LargeBinary

    def process_bind_param(self, value, dialect):
        if value is None:          # pass NULLs through untouched
            return None
        return zlib.compress(pickle.dumps(value, -1), 9)

    def process_result_value(self, value, dialect):
        if value is None:
            return None
        return pickle.loads(zlib.decompress(value))

    def copy(self):
        return CompressedPickle(self.impl.length)
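The bind/result pair is just pickle-then-compress and its inverse, which
can be exercised on its own with nothing but the stdlib (the `bind` and
`result` helper names below are made up for illustration):

```python
import pickle
import zlib

def bind(value):
    # Same transformation process_bind_param applies on the way in.
    return zlib.compress(pickle.dumps(value, -1), 9)

def result(blob):
    # Inverse transformation, as in process_result_value.
    return pickle.loads(zlib.decompress(blob))

obj = {"rows": list(range(1000))}
blob = bind(obj)
assert result(blob) == obj  # round-trips exactly
# Repetitive data like this compresses well, so the stored blob is
# smaller than the raw pickle.
assert len(blob) < len(pickle.dumps(obj, -1))
```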




-- 
Mike Orr <[EMAIL PROTECTED]>
