Chris Angelico wrote:

> Sounds like a lot of hassle, and a lot of things that could be done
> wrongly. Personally, if I need that level of reliability and
> atomicity, I'd rather push the whole question down to a lower level:
> maybe commit something to a git repository and push it to a remote
> server,
How is pushing something to a remote server more reliable than writing
to a local drive?

> or use a PostgreSQL database, or something of that sort. Let
> someone else have the headaches about "what if AV opens the file". Let
> someone else worry about how to cope with power failures at arbitrary
> points in the code.

That "someone else" is still you, though. And "use a PostgreSQL
database" is no solution to the problem "how do I reliably write a
250MB video file?". Or, for that matter, a 2KB text file. Databases are
great as databases, but they aren't files.

> (Though, to be fair, using git for this doesn't
> fully automate failure handling; what it does is allow you to detect
> issues on startup, and either roll back ("git checkout -f") or apply
> ("git commit -a") if it looks okay.)

I don't see how this sort of manual handling is less trouble than
using a proper, reliable save routine. To start with, what do you mean
by "detect issues on startup" -- startup of what?

-- 
Steven

-- 
https://mail.python.org/mailman/listinfo/python-list
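P.S. For illustration, here is roughly the kind of reliable save
routine I have in mind -- a sketch only, with the name `atomic_save`
and the error handling my own invention: write to a temporary file in
the same directory, fsync it, then atomically rename it over the
target.

```python
import os
import tempfile

def atomic_save(path, data):
    """Sketch of an atomic save: either the old file or the new file
    survives a crash, never a half-written mixture."""
    # The temp file must live in the same directory as the target, so
    # the final rename stays on one filesystem and is atomic.
    dirname = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=dirname)
    try:
        with os.fdopen(fd, 'wb') as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())  # push the bytes to stable storage
        # os.replace is atomic on POSIX, and on Windows too (3.3+).
        os.replace(tmp, path)
    except BaseException:
        os.unlink(tmp)  # clean up the half-written temp file
        raise
```

(For full durability you'd also fsync the containing directory on
POSIX, and decide what to do about antivirus software holding the file
open on Windows -- which is exactly the headache being discussed.)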