On 08/11/2016 03:30 AM, Wibowo Arindrarto wrote:
Dear all,

I have a little problem with getting the state of objects that have been
flushed but not committed. As far as I understand, SQLAlchemy does not
have an official object state corresponding to this.

The Session is intended to work with objects only within a transaction. Outside of a transaction, there is nothing. Therefore there's no such state as "this object exists outside of the transaction", because in fact we have no idea. It could have been deleted since we last loaded it. We can't get to that state unless we start a new transaction.

Now I know what someone might be getting at with this: they want to access the object concurrently from some other transaction, assuming the isolation levels are compatible with that. But those patterns are outside the Session's built-in functionality, so your approach of using events to do what you want is fine.


 However, in my
application I need to track such objects. My current approach is to
listen to the ``after_flush`` session event, similar to the one outlined
here <https://groups.google.com/forum/#!topic/sqlalchemy/eGxpQBChXQw>.

Sounds great.
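Roughly, that pattern might look like this (a minimal sketch only; the "flushed_objects" key is just a placeholder):

from sqlalchemy import event
from sqlalchemy.orm import Session

@event.listens_for(Session, "after_flush")
def collect_flushed(session, flush_context):
    # within after_flush, session.new still holds the newly INSERTed
    # objects and their primary keys are already assigned; stash them
    # in session.info so they can be inspected before the commit
    session.info.setdefault("flushed_objects", set()).update(session.new)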

However, that gets messy quite quickly if I want to track different
instances from different tables, as the approach uses a single global
``db.session.info`` state.

Organize the objects into a dictionary keyed by their class, where each value is a further collection of the objects themselves. Or just key them by (obj.__class__, obj.id) in a single dictionary.
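Keying by (class, primary key), the earlier listener could instead be written along these lines (a sketch; it assumes a single-column primary key attribute named "id" on your models):

from sqlalchemy import event
from sqlalchemy.orm import Session

@event.listens_for(Session, "after_flush")
def collect_flushed(session, flush_context):
    # one dictionary in session.info, keyed so that rows from
    # different tables don't collide
    flushed = session.info.setdefault("flushed", {})
    for obj in session.new:
        flushed[(obj.__class__, obj.id)] = obj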



For the record, I'm also open to suggestions / critiques on whether
tracking flushed but not committed objects is a good idea. The reason I
am doing this is that I need to do something like ``get_or_create``
(get a database record if present, create it otherwise). But I'm doing
this multiple times, and some objects may be containers of multiple
other objects. To make the ``get_or_create`` query of these container
objects simpler, I store the hash of the object IDs that each contains.
Doing so means that I need to have all the contained object IDs, which
means I have to at least flush the contained objects before I can
run ``get_or_create`` on the container object.

This sounds like you need the primary keys, which you get from flush(). That's fine, but it has nothing to do with "committed".
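A plain get_or_create() along those lines might look like this (a sketch only; it assumes the lookup columns are passed as keyword arguments):

def get_or_create(session, model, **kwargs):
    # look the row up first; otherwise create it and flush so the
    # primary key is assigned without committing the transaction
    instance = session.query(model).filter_by(**kwargs).one_or_none()
    if instance is None:
        instance = model(**kwargs)
        session.add(instance)
        session.flush()
    return instance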



Now, my question is: is it possible to extend my models such that I can
track their commit status? If so, how do I do it? If not, is there a
better way to tackle my underlying problem (i.e. doing ``get_or_create``
on a container object)? I am aware of the ``persistent`` state, but it
does not distinguish between committed and uncommitted objects.

Like other replies here, I'm not understanding the link between "commit" and "get_or_create". Having a dictionary of objects that already exist is a fine pattern to use for get_or_create, but if you have the object and a primary key, it's in. If this is a collection that is used concurrently by many sessions, then I'd build a session-local copy of your object map and then push it out to the "global" one within the after_commit() event.
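For the concurrent case, that could look roughly like this (a sketch; GLOBAL_OBJECT_MAP is just an illustration and would need a lock if sessions run in multiple threads, and "flushed" is the session-local map from the earlier sketch):

from sqlalchemy import event
from sqlalchemy.orm import Session

# illustrative process-wide map shared by all sessions
GLOBAL_OBJECT_MAP = {}

@event.listens_for(Session, "after_commit")
def publish_committed(session):
    # only push the session-local map out once the transaction
    # has actually committed
    GLOBAL_OBJECT_MAP.update(session.info.pop("flushed", {}))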





Thank you in advance,
Bow
