[sqlalchemy] Problem with the UnicodeText type that stores data in an Oracle NCLOB column

2012-04-11 Thread Pierre Bossé
Hi everyone, we have a problem with the UnicodeText type that stores data in an Oracle NCLOB column. When we use a Python unicode string to write data to the table, we are limited to 4000 characters, while it is possible to store many more characters (6000 in the following example) using a
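The setup the post describes can be sketched without a live database by compiling the DDL against SQLAlchemy's Oracle dialect. The table and column names below are invented for illustration; note that whether `UnicodeText` renders as `NCLOB` or `CLOB` depends on the SQLAlchemy version and the dialect's `use_nchar_for_unicode` setting, and the 4000-character ceiling reported in the thread is commonly traced to how the *bind parameter* is sent by the driver, not to the column's DDL:

```python
from sqlalchemy import Table, Column, Integer, MetaData, UnicodeText
from sqlalchemy.schema import CreateTable
from sqlalchemy.dialects import oracle

metadata = MetaData()

# Hypothetical table mirroring the scenario: a unicode LOB column.
notes = Table(
    "notes",
    metadata,
    Column("id", Integer, primary_key=True),
    Column("body", UnicodeText),
)

# Compile the CREATE TABLE statement for Oracle without connecting.
# Depending on version/settings, the body column renders as NCLOB or CLOB.
ddl = str(CreateTable(notes).compile(dialect=oracle.dialect()))
print(ddl)
```

Compiling DDL this way is a quick check that the column type maps the way you expect before investigating driver-level bind behaviour.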

[sqlalchemy] Re: Elixir: Enum and accented characters (non-ASCII character)

2012-01-17 Thread Pierre Bossé
Thank you Michael for your answers, but this does not work even with the UTF-8 encoding. I changed my program to use utf-8: = # -*- coding: utf-8 -*- from elixir import * from sqlalchemy import create_engine class TestEnum(Entity):
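A minimal SQLAlchemy-only sketch of the situation (Elixir is a thin layer over SQLAlchemy's types): when the enum values reach SQLAlchemy as real unicode strings, the accented characters survive into the generated DDL. The table and values below are invented for illustration, and `create_constraint=True` is passed so the generic dialect emits a CHECK constraint listing the values, making them visible in the output:

```python
from sqlalchemy import Table, Column, Integer, MetaData, Enum
from sqlalchemy.schema import CreateTable

metadata = MetaData()

# Hypothetical table: an Enum column whose values contain
# accented (non-ASCII) characters, as in the thread.
seasons = Table(
    "seasons",
    metadata,
    Column("id", Integer, primary_key=True),
    Column(
        "name",
        Enum("été", "automne", name="season_enum", create_constraint=True),
    ),
)

# Compile with the default dialect; the CHECK constraint lists the values.
ddl = str(CreateTable(seasons).compile())
print(ddl)
```

On Python 2 (the era of this thread) the same values had to be unicode literals, e.g. `Enum(u'été', ...)`, otherwise byte strings in the source encoding were passed through and could be mangled.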

[sqlalchemy] Elixir: Enum and accented characters (non-ASCII character)

2012-01-16 Thread Pierre Bossé
Hello, here is my problem: I would like to define a domain of values for a table field. Some of the values have accented characters (non-ASCII characters). When Elixir generates the DDL, the provided values are converted and the accented characters are lost. Here is my test code:
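The kind of corruption described above is typical mojibake: UTF-8 encoded bytes being re-read under a one-byte encoding such as Latin-1, which turns each accented character into two garbled ones. A self-contained demonstration, independent of Elixir or the database:

```python
# UTF-8 bytes decoded as Latin-1: each accented character ('é' is the
# two bytes 0xC3 0xA9 in UTF-8) comes back as two Latin-1 characters.
original = "été"
garbled = original.encode("utf-8").decode("latin-1")
print(garbled)  # Ã©tÃ©
```

If the DDL shows values like `Ã©tÃ©` instead of `été`, it is a strong sign that the values were encoded once and decoded under the wrong charset somewhere between the Python source and the database.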