You can use export_to_csv_file, but I don't think you can do it from admin:
http://web2py.com/book/default/chapter/06#CSV-(all-tables-at-once)
Anthony
Is export_to_csv a preferred way of backing up a database?
On Mar 21, 1:58 pm, Anthony wrote:
> You can use export_to_csv_file, but I don't think you can do it from
> admin: http://web2py.com/book/default/chapter/06#CSV-(all-tables-at-once)
>
> Anthony
I'd be curious to hear thoughts on that as well. I was contemplating what to
do when switching from SQLite to Postgres, but I'd also like to understand
better what to do in production.
Each DB has its own methods. The integrity and consistency of
relational data is of crucial importance. I would only trust the
approved and recommended backup tool for the DB.
I normally write separate command-line scripts to do the backups, which
run from cron during the night. The script also rsyncs the dump files to
another machine.
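For Postgres, such a nightly script might look something like this sketch (the database name and output directory are hypothetical; pg_dump is the database's own approved backup tool):

```python
# Sketch of a cron-driven backup script, assuming Postgres and a
# pg_dump binary on the PATH. The dump file name carries the date so
# old dumps can be rotated; rsync of the dump directory is left out.
import datetime

def pg_dump_command(dbname, outdir='/var/backups'):
    """Build a pg_dump invocation for a dated custom-format dump."""
    stamp = datetime.date.today().isoformat()
    outfile = '%s/%s-%s.dump' % (outdir, dbname, stamp)
    return ['pg_dump', '--format=custom', '--file=' + outfile, dbname]

# cron would then run something like:
#   subprocess.run(pg_dump_command('mydb'), check=True)
```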
I don't know if it was because I used the old SQLite version (Debian 5), but
SQLite constantly had a database lock problem. (You can Google this.) And
when there's a database lock problem, you essentially have to restart the
server. Other than that, SQLite is very fast; a lot faster than Postgres.
I moved from MySQL to PostgreSQL a few months ago with a small database, but
with a significant number of many-to-many relations as well as one-to-many.
Here is what I did:
Exported the data from MySQL using the export_to_csv_file call from the
shell with the model activated, which pushes all the tables into a single
CSV file.
Great info, thanks!
On 22.3.2011 7:42, pbreit wrote:
> Great info, thanks!
If you are moving from one database to another, you can open a database
connection in web2py, extract the data, and write it back to the other
database.
Kenneth