Like all CRUD goes, I need to write some data to a table. When I write new
data to the table, everything works like a charm. The problem starts when I
need to write data that already exists in the table (actually updating rows
with the same primary key): the data just doesn't seem to be written.
One thing to note is that deepcopy() is not going to work: it will copy
SQLAlchemy's own accounting information on the object as well, and generally
cause confusion.
The easiest way to insert a lot of data while detecting dupes efficiently is to
sort the data, then chunk through it, and for
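The quoted advice is cut off above, but the sort-and-chunk approach usually takes this shape: sort the incoming rows by key, walk them in fixed-size chunks, and for each chunk issue one SELECT to find which keys already exist, splitting the chunk into inserts and updates. Here is a minimal sketch of that logic; the `items` table, column names, and chunk size are made up for illustration, and plain `sqlite3` stands in for the real database:

```python
# Sketch of the sort-and-chunk dedupe approach (illustrative names).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, value TEXT)")
conn.executemany("INSERT INTO items VALUES (?, ?)", [(1, "old"), (3, "old")])

incoming = [(3, "new"), (1, "new"), (2, "new"), (4, "new")]
CHUNK = 2  # tiny chunk size just for the demo; use ~1000 in practice

sorted_rows = sorted(incoming)  # sort once so each chunk is a key range
for start in range(0, len(sorted_rows), CHUNK):
    chunk = sorted_rows[start:start + CHUNK]
    keys = [k for k, _ in chunk]
    placeholders = ",".join("?" * len(keys))
    # One SELECT per chunk tells us which keys are already present.
    existing = {row[0] for row in conn.execute(
        f"SELECT id FROM items WHERE id IN ({placeholders})", keys)}
    inserts = [row for row in chunk if row[0] not in existing]
    updates = [(v, k) for k, v in chunk if k in existing]
    conn.executemany("INSERT INTO items VALUES (?, ?)", inserts)
    conn.executemany("UPDATE items SET value = ? WHERE id = ?", updates)
conn.commit()

final = sorted(conn.execute("SELECT * FROM items"))
print(final)  # [(1, 'new'), (2, 'new'), (3, 'new'), (4, 'new')]
```

The point of sorting first is that each chunk's keys fall in a narrow range, so the per-chunk SELECT stays cheap and no duplicate ever reaches the INSERT.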
Thanks Michael for the good advice.
Since I think this chunking solution won't work for this specific use case
(the keys would be hard to sort), wouldn't it be easier just to
move transaction.commit() after each flush, so the DBSession.rollback()
wouldn't lose existing data in the
If you want to go that approach I suggest you use begin_nested(), which will
produce a SAVEPOINT, local to a certain scope within the transaction. You'll
have better results with 0.8 using this approach.
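A minimal sketch of that SAVEPOINT approach using Session.begin_nested(), assuming SQLAlchemy 1.4+ and an in-memory SQLite database; the `User` model and the row data are made up for illustration. The two event hooks are the workaround the SQLAlchemy docs give for pysqlite's transaction handling, so that SAVEPOINTs behave correctly on SQLite:

```python
from sqlalchemy import Column, Integer, String, create_engine, event
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class User(Base):  # hypothetical model for the demo
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    name = Column(String)

engine = create_engine("sqlite://")

@event.listens_for(engine, "connect")
def _connect(dbapi_conn, record):
    dbapi_conn.isolation_level = None  # let SQLAlchemy control BEGIN

@event.listens_for(engine, "begin")
def _begin(conn):
    conn.exec_driver_sql("BEGIN")

Base.metadata.create_all(engine)

rows = [{"id": 1, "name": "a"}, {"id": 1, "name": "b"}, {"id": 2, "name": "c"}]

with Session(engine) as session:
    for row in rows:
        try:
            # Each insert runs inside its own SAVEPOINT, so a duplicate
            # primary key rolls back only this row, not the whole
            # transaction.
            with session.begin_nested():
                session.add(User(**row))
        except IntegrityError:
            # Row already exists -- update it in place instead.
            session.query(User).filter_by(id=row["id"]).update(
                {"name": row["name"]})
    session.commit()

with Session(engine) as session:
    result = {u.id: u.name for u in session.query(User)}
print(result)  # {1: 'b', 2: 'c'}
```

Compared with committing after every flush, the SAVEPOINT keeps everything in one transaction: a failed insert is rolled back to the savepoint, while the rows already flushed before it survive untouched.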
On Mar 18, 2013, at 1:54 PM, alonn alonis...@gmail.com wrote: