I am using Django as a content admin backend for a desktop
application. The desktop application needs to cache the Django DB
structure and uploaded files directory from the remote server running
Django. This is accomplished with an "export" Django app I wrote,
which serves up the DB as JSON (using code copy-pasted from the
dumpdata command), plus rsync over SSH to sync the uploaded-files
directory to a local location.
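For context, Django's dumpdata command emits a JSON fixture of the
shape [{"model": ..., "pk": ..., "fields": {...}}, ...]. Here is a
tiny Django-free stand-in that renders rows in that shape (the
function name, model label, and rows are all made up for
illustration; the real app reuses the serializer machinery behind
dumpdata):

```python
import json

def dump_fixture(model_label, rows, pk_field="id"):
    """Render plain dict rows in the dumpdata JSON fixture shape:
    [{"model": ..., "pk": ..., "fields": {...}}, ...]"""
    out = []
    for row in rows:
        # Everything except the primary key goes under "fields".
        fields = {k: v for k, v in row.items() if k != pk_field}
        out.append({"model": model_label, "pk": row[pk_field], "fields": fields})
    return json.dumps(out)

payload = dump_fixture("export.article", [{"id": 1, "title": "hello"}])
```

The desktop application can then load this payload with the same
loaddata-style semantics it would use for any Django fixture.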

The problem is that the state of the DB & uploaded files directory
could change after the JSON is dumped and before the "rsync" finishes.
Maybe it could even change while the JSON is dumping?

I could solve this issue with some sort of distributed locking
mechanism, but I'm wondering if there is a better Django-specific way
to do it. Some sort of DB+files export transaction? Actually, maybe
DB consistency alone is the real issue, because the worst case with
the files is that I pull down new files not referenced by the
seconds-older JSON?

-- 
You received this message because you are subscribed to the Google Groups 
"Django users" group.