>> I have some long-running processes that do very long simulations which
>> at the end need to write things to a database.
>>
>> At the moment there are sometimes network problems and we end up with
>> half the data in the database.
>>
>> The half-data problem is probably solved easily with sessions and
>> SQLAlchemy (a db transaction), but we would still like to be able to
>> keep a backup SQL file in case something goes badly wrong and we want
>> to re-run it manually.
>>
>> This might also be useful if we have to roll back the db to a previous
>> day for some reason and don't want to re-run the simulations.
>>
>> Has anyone done something similar?
>> It would be nice to do something like:
>>
>> with CachedDatabase('backup.sql'):
>>     # do all your things
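A minimal sketch of such a CachedDatabase context manager, assuming
SQLAlchemy (the name cached_database and the backup file format are made
up for illustration): every statement the engine emits is also appended
to a local SQL file, and the whole block runs in one transaction, so a
network failure rolls the database back but leaves the SQL on disk for a
manual re-run.

    from contextlib import contextmanager

    from sqlalchemy import create_engine, event, text

    @contextmanager
    def cached_database(engine, backup_path):
        with open(backup_path, "a") as backup:
            def log_sql(conn, cursor, statement, parameters,
                        context, executemany):
                # Append the raw statement; parameters are logged
                # alongside rather than interpolated into the SQL.
                backup.write(f"{statement};\n-- params: {parameters!r}\n")

            event.listen(engine, "before_cursor_execute", log_sql)
            try:
                # engine.begin() is a real transaction: commit on
                # success, rollback on any exception.
                with engine.begin() as conn:
                    yield conn
            finally:
                event.remove(engine, "before_cursor_execute", log_sql)

    # Usage (connection URL is a placeholder):
    # engine = create_engine("postgresql://user:pass@host/db")
    # with cached_database(engine, "backup.sql") as conn:
    #     conn.execute(text("INSERT INTO results (x) VALUES (:x)"),
    #                  {"x": 1})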
" ... at the end need to write things on a database ... " Is it necessary to write those things during the process, or only at the end? If only at the end, can you write locally first, and then write that local store to your remote database? -- http://mail.python.org/mailman/listinfo/python-list