On Dec 8, 7:39 pm, Thadeus Burgess <thade...@thadeusb.com> wrote:
> I am thinking that if the database on 1 goes down, a sqlite can be set
> up fairly quickly if csv is available.

Although I appreciate the idea that the backup data could be
transportable across different DBs, restoring to the original DB still
seems the sensible course; otherwise any triggers, stored procedures,
etc. would be missing.  Therefore, apart from maybe SQLite, this is
not so much a reliable backup as a migration tool.
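
Incidentally, the DAL already gives us the mechanics for that kind of
portable dump; something like the following (the file name is a
placeholder, and the target DB needs the same define_table() calls
before importing):

    # dump the whole database to one CSV file (portable across back-ends)
    db.export_to_csv_file(open('backup.csv', 'wb'))

    # restore into a freshly defined SQLite database; note that triggers,
    # stored procedures, etc. are not carried over
    other_db = DAL('sqlite://restore.sqlite')
    # ... same define_table() calls as the original db go here ...
    other_db.import_from_csv_file(open('backup.csv', 'rb'))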

When a machine goes down, it is that day's latest data that is lost.
That's why I suggested a simple replication idea, or at least some way
of regularly (every few minutes) copying new and updated records to
another machine.  If not replication, then a log, perhaps an extension
of Massimo's audit trail slice.  For free, reliable storage, the log
could perhaps be emailed to a Gmail account.
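
A rough sketch of what I mean, assuming every table has a modified_on
field and that this runs from the web2py shell or a cron task where db
is already defined (the Gmail address and password are placeholders):

    import smtplib, datetime
    from email.mime.text import MIMEText

    def changes_since(last_run):
        "collect rows created/updated since last_run as CSV text"
        chunks = []
        for name in db.tables:
            table = db[name]
            rows = db(table.modified_on > last_run).select()
            if rows:
                # a web2py Rows object serializes to CSV via str()
                chunks.append('TABLE %s\n%s' % (name, str(rows)))
        return '\n'.join(chunks)

    def mail_log(csv_text):
        "email the incremental log to a Gmail account"
        msg = MIMEText(csv_text)
        msg['Subject'] = 'incremental backup %s' % datetime.datetime.now()
        msg['From'] = 'me@example.com'
        msg['To'] = 'mybackup@gmail.com'
        server = smtplib.SMTP('smtp.gmail.com', 587)
        server.starttls()
        server.login('mybackup@gmail.com', 'password')
        server.sendmail(msg['From'], [msg['To']], msg.as_string())
        server.quit()

    # e.g. from cron every few minutes:
    # mail_log(changes_since(datetime.datetime.now()
    #                        - datetime.timedelta(minutes=5)))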

Of course, if we had native DB backups, a web2py DB-agnostic backup,
and either simple replication or an audit-trail log, we'd have
everything we need to rebuild our data.  We might still lose the last
few minutes, but anyone whose data is that critical will already have
serious RAID and replication in place, which perhaps makes this
discussion irrelevant for them.

D
