Hello Raphael,

Oh - a use case for django-transaction-signals 
<https://github.com/aaugustin/django-transaction-signals> ;-) I'm bringing up 
this elaborate joke because you're essentially asking for a "pre-commit" signal 
here and the README contains a good list of things that can go wrong with 
transaction signals. (Please ignore how this package demonstrates a way to do 
it as third-party code *cough* *cough* *cough*)

> I figured a perfect way to do this would be using `save()` or
> `post_save` to add the changed model instance to some kind of
> thread-local list, and then using `transaction.on_commit` to "schedule"
> the aggregation and create the log entries when all changes have been
> made. However, this obviously is not good enough because `on_commit`
> runs *after* the `COMMIT` statement and thus we're not guaranteed that
> all log entries are persisted to the database.


In my opinion "saving the log entries may fail after a successful transaction" 
isn't the main design problem here. The bigger problem is "log may contain 
entries for writes that don't actually happen, because some savepoints were 
rolled back, typically due to atomic blocks exiting with an exception". And 
then you get dragged into the whole complexity that the README of 
django-transaction-signals outlines and that we're trying to avoid in Django.

(If you don't have any savepoint rollbacks, then your code sounds sufficiently 
constrained to implement logging of changes explicitly at the application layer 
rather than at the framework layer.)

If you run with ATOMIC_REQUESTS, I would suggest replacing it with a custom 
middleware that wraps get_response(request) in an atomic block. Then you know 
that this is the outermost transaction and you can do whatever is needed before 
exiting the atomic block. You also need the same in all management commands, 
which could be a problem if you depend on third-party management commands.

Failing that, in order to log changes with transactional correctness enforced 
by the ACID guarantees of PostgreSQL, I'd recommend doing it at the database 
level with triggers, which always execute in the same transaction. I realize 
it may be difficult to perform the kind of aggregation you have in mind.
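To illustrate the point that the trigger's write lives and dies with the 
transaction that fired it (toy example using SQLite from the stdlib rather 
than PostgreSQL, with made-up table names):

```python
# A trigger writes to change_log in the same transaction as the triggering
# INSERT, so a rollback discards both writes atomically.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE item (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE change_log (id INTEGER PRIMARY KEY, item_name TEXT);
    CREATE TRIGGER log_item_insert AFTER INSERT ON item
    BEGIN
        INSERT INTO change_log (item_name) VALUES (NEW.name);
    END;
""")

# A committed insert is logged...
with conn:
    conn.execute("INSERT INTO item (name) VALUES ('kept')")

# ...while a rolled-back insert leaves no log entry behind.
try:
    with conn:
        conn.execute("INSERT INTO item (name) VALUES ('discarded')")
        raise RuntimeError("force rollback")
except RuntimeError:
    pass

print([row[0] for row in conn.execute("SELECT item_name FROM change_log")])
# → ['kept']
```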

As a last resort, I'd try a custom database backend to accurately track 
transactions and savepoints and maintain an up-to-date record of the changes 
that will happen when the transaction is committed.
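The bookkeeping such a backend would need looks roughly like this (not a 
backend, just the core idea; all names are hypothetical, and it ignores 
details like releasing savepoints):

```python
# Buffer pending change records per savepoint; when a savepoint is rolled
# back, discard everything recorded since it was created.
class ChangeTracker:
    def __init__(self):
        self._changes = []     # pending change records, in order
        self._savepoints = {}  # savepoint id -> index into _changes

    def record(self, change):
        self._changes.append(change)

    def savepoint(self, sid):
        self._savepoints[sid] = len(self._changes)

    def savepoint_rollback(self, sid):
        mark = self._savepoints.pop(sid)
        # Drop changes, and savepoints, created after this savepoint.
        del self._changes[mark:]
        self._savepoints = {
            s: i for s, i in self._savepoints.items() if i <= mark
        }

    def commit(self):
        pending, self._changes = self._changes, []
        self._savepoints.clear()
        return pending  # what actually happened in this transaction
```

The backend would call record() from its execute path and the savepoint 
methods from its savepoint hooks, then persist or process commit()'s return 
value before the real COMMIT.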

Hope this helps!

-- 
Aymeric.
