At the moment I am running two cron jobs. One executes web2py to
export the database to a CSV file and tars it up. The other runs an
hour later, uses pg_dump to dump the SQL statements, and tars those
up. It then ships the data over SSH to our backup file server.
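For anyone curious, a minimal sketch of the web2py half could look
something like the script below, run from cron with web2py's -S/-M/-R
options. The app name "myapp" and the /backups paths are placeholders
I made up for the example, not the actual setup:

    # backup_csv.py -- cron would run something like:
    #   cd /path/to/web2py && python web2py.py -S myapp -M \
    #       -R applications/myapp/private/backup_csv.py
    import os, time, tarfile

    stamp = time.strftime('%Y%m%d%H%M')
    csv_path = '/backups/myapp-%s.csv' % stamp

    # db comes from the app's model files, loaded because of -M
    f = open(csv_path, 'wb')
    db.export_to_csv_file(f)   # dumps every table into one CSV file
    f.close()

    # tar/gzip the CSV, matching the "tar it up" step
    tar = tarfile.open(csv_path + '.tar.gz', 'w:gz')
    tar.add(csv_path, arcname=os.path.basename(csv_path))
    tar.close()
    os.remove(csv_path)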

For the moment this seems to be the best solution, since both backups
combined come to only about 800 KB; as that size grows I will most
likely stick with one or the other.

This has two advantages. First, I have the data in a web2py format
that I can load onto my development machine running SQLite, so I can
test with real data. Second, I still have a native database backup in
case the server crashes.
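Loading that CSV back into the SQLite dev copy is just the reverse
call, roughly like this (again, the app name and file name are only
placeholders):

    # restore_csv.py -- run inside the dev copy of the app:
    #   cd /path/to/web2py && python web2py.py -S myapp -M \
    #       -R applications/myapp/private/restore_csv.py
    # With -M the models have already defined db and its tables on
    # sqlite, so the rows just need to be read back in.
    f = open('/backups/myapp-backup.csv', 'rb')
    db.import_from_csv_file(f)
    f.close()
    db.commit()   # scripts run with -R must commit explicitly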

-Thadeus





On Tue, Dec 8, 2009 at 5:28 PM, Yarko Tymciurak
<resultsinsoftw...@gmail.com> wrote:
> this is preferable / easier in the bigger scheme of things,
> than thinking about cvs / sqlite (blech! spit!  ar
