On 2014-12-04 08:23, Staszek wrote:
On 2014-11-29 16:19, Michael Bayer wrote:
SQLAlchemy shouldn’t be attempting to run this decode operation unless the
MySQL driver used here is acting in a flaky way, that is, SQLAlchemy did a
test on first connect to see if a Unicode() type comes back as
Hello.
The following code crashes:
# db init...
meta = MetaData()
foo = Table('tmp_foo', meta,
Column('id', Integer, primary_key=True),
prefixes=['TEMPORARY'],
)
conn = session.connection()
foo.create(conn, checkfirst=True)
foo.create(conn, checkfirst=True)
This is because the
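For reference, the TEMPORARY prefix itself renders fine at the DDL level; a minimal sketch that only compiles the CREATE statement (no database connection, so the checkfirst behavior that triggers the crash is not exercised here):

```python
from sqlalchemy import MetaData, Table, Column, Integer
from sqlalchemy.schema import CreateTable

meta = MetaData()
foo = Table('tmp_foo', meta,
    Column('id', Integer, primary_key=True),
    prefixes=['TEMPORARY'],
)

# The prefix lands between CREATE and TABLE in the emitted DDL.
print(str(CreateTable(foo)))
```

The checkfirst problem is separate: it depends on whether the backend's table-existence check can see temporary tables at all, which varies by dialect.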
On Dec 4, 2014, at 9:36 AM, Ladislav Lenart lenart...@volny.cz wrote:
Hello.
The following code crashes:
# db init...
meta = MetaData()
foo = Table('tmp_foo', meta,
Column('id', Integer, primary_key=True),
prefixes=['TEMPORARY'],
)
conn = session.connection()
On Dec 4, 2014, at 3:46 AM, Staszek stf.list.ot...@eisenbits.com wrote:
On 2014-12-04 08:23, Staszek wrote:
On 2014-11-29 16:19, Michael Bayer wrote:
SQLAlchemy shouldn’t be attempting to run this decode operation unless the
MySQL driver used here is acting in a flaky way, that is,
Wow! Thank you!
I guess this is near-light-speed support in practice! :-)
I stumbled upon this issue while I was trying to figure out how to work with
temporary tables in SQLAlchemy. Final version of my code does not use the
checkfirst flag at all, because I know when to create and when to
Hello.
How can I specify ON COMMIT... for a TEMP table in SA?
I.e. the following Python code
meta = MetaData()
foo = Table('tmp_foo', meta,
Column('id', Integer, primary_key=True),
Column('val', Integer, nullable=False),
prefixes=['TEMPORARY'],
)
conn = session.connection()
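One common route, since there is no generic Table option for ON COMMIT, is a custom DDL compilation hook via sqlalchemy.ext.compiler. A sketch under the assumption that we stash the desired clause in the table's info dict (the 'on_commit' key is my own convention, not a SQLAlchemy API; on PostgreSQL the dialect-specific postgresql_on_commit table argument may do this for you in recent versions):

```python
from sqlalchemy import MetaData, Table, Column, Integer
from sqlalchemy.schema import CreateTable
from sqlalchemy.ext.compiler import compiles

@compiles(CreateTable)
def _create_table_with_on_commit(element, compiler, **kw):
    # Render the normal CREATE TABLE, then append ON COMMIT if requested.
    stmt = compiler.visit_create_table(element, **kw)
    on_commit = element.element.info.get('on_commit')
    if on_commit:
        stmt = stmt.rstrip() + " ON COMMIT %s\n" % on_commit
    return stmt

meta = MetaData()
foo = Table('tmp_foo', meta,
    Column('id', Integer, primary_key=True),
    Column('val', Integer, nullable=False),
    prefixes=['TEMPORARY'],
    info={'on_commit': 'DELETE ROWS'},
)

print(str(CreateTable(foo)))
```

Note that @compiles registers the hook process-wide for all CreateTable compilation, so the info-dict check is what keeps ordinary tables unaffected.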
Hello all,
We are facing a problem when using history_meta.py recipe.
It seems like two concurrent transactions read the same (id, version) tuple
at #1, and then each one tries to insert a new row into the pbases_history
table with the same (id, version) primary key combination (see #2).
By removing #2,
On Dec 4, 2014, at 6:36 PM, HP3 henddher.pedr...@gmail.com wrote:
Hello all,
We are facing a problem when using history_meta.py recipe.
It seems like two concurrent transactions read the same (id,version) tuple at
#1 and then each one tries to insert a new row into the pbases_history
Hello,
I'm trying to build a self-referential User-User friendship relationship
query using a join table, but I'd also like to be able to access a
column/attribute on the join table in the query result.
For reference, my setup is very similar to the one seen at
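The usual way to get at an extra column on the join table is to map the join table as an association object rather than a plain secondary table, so its columns are queryable in their own right. A self-contained sketch (the Friendship class and the since column are illustrative, not from the original post):

```python
from sqlalchemy import Column, Integer, String, ForeignKey, create_engine
from sqlalchemy.orm import relationship, sessionmaker
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = 'user'
    id = Column(Integer, primary_key=True)
    name = Column(String(50))

class Friendship(Base):
    """The join table, mapped as a full class so its columns are queryable."""
    __tablename__ = 'friendship'
    user_id = Column(Integer, ForeignKey('user.id'), primary_key=True)
    friend_id = Column(Integer, ForeignKey('user.id'), primary_key=True)
    since = Column(String(10))  # the extra attribute we want in results
    user = relationship(User, foreign_keys=[user_id], backref='friendships')
    friend = relationship(User, foreign_keys=[friend_id])

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

alice, bob = User(name='alice'), User(name='bob')
session.add(Friendship(user=alice, friend=bob, since='2014'))
session.commit()

# Query each friend together with the join-table column.
for friend, since in (session.query(User, Friendship.since)
                      .join(Friendship, Friendship.friend_id == User.id)
                      .filter(Friendship.user_id == alice.id)):
    print(friend.name, since)
```

Because there are two foreign keys back to user, each relationship() needs an explicit foreign_keys argument to disambiguate the join.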
Thank you very much Mike
We just tried this:
session.query(obj.__class__.version).with_for_update().filter(
    obj.__class__.id == obj.id).one()
attr['version'] = obj.version
...
But the end result was the same:
IntegrityError: (IntegrityError) duplicate key value violates unique
2014-12-04 19:06:16,938 INFO [root][MainThread]
http://localhost:6543/resources/pages/0001---0002-0001/annotations
2014-12-04 19:06:16,938 INFO [root][MainThread]
http://localhost:6543/resources/pages/0001---0002-0001/annotations
2014-12-04
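For what it's worth, FOR UPDATE only prevents the duplicate-key race if every transaction that bumps the version issues the same locking read before computing version + 1, and if the isolation level lets the waiter see the committed increment once the lock is granted. A minimal sketch of the statement being emitted (the pbases name mirrors the post; everything else is assumed):

```python
from sqlalchemy import table, column, select

# Lightweight core constructs, just to show the rendered SQL.
pbases = table('pbases', column('id'), column('version'))

# Each writer must run this locking read *before* computing the next
# version number; a writer that read the version earlier in the same
# transaction can still collide on (id, version).
stmt = select(pbases.c.version).where(pbases.c.id == 1).with_for_update()
print(stmt)
```

Whether the clause is actually honored depends on the backend; SQLite, for instance, has no row-level FOR UPDATE.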
This is a simplified example of my issue. I wrote a detailed example, but
it might be too confusing.
There are three classes:

Entity (only has one of user_id or username):
    id
    user_id
    username
    profile = relationship( EntityProfile based on user_id )
Hi,
We were having a similar problem with history_meta, so we created
history_meta_date (.py file attached). In our version, we use timestamps
rather than sequence numbers to track history versions and don't need to
worry about duplicate key problems. The version_date timestamps still
provide
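As a rough illustration of the timestamp approach (the version_date column name is from the post; the model and other names are assumed):

```python
from datetime import datetime
from sqlalchemy import Column, DateTime, Integer, String, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()

class PageHistory(Base):
    # History rows are keyed by (id, version_date): two versions of the
    # same row coexist as long as their timestamps differ, which sidesteps
    # the duplicate (id, version) race of counter-based versioning.
    __tablename__ = 'page_history'
    id = Column(Integer, primary_key=True)
    version_date = Column(DateTime, primary_key=True, default=datetime.utcnow)
    body = Column(String)

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add(PageHistory(id=1, body='v1'))
session.commit()
```

One caveat: two writers that hit the same timestamp at the column's resolution would still collide, so this narrows the race rather than eliminating it.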