Re: [web2py] Re: Export DB from admin?

2011-03-22 Thread Kenneth Lundström

On 22.3.2011 7:42, pbreit wrote:
Great info, thanks! 


If you are moving from one database to another, you can open both database 
connections in web2py, extract the data from one, and write it to the 
other database.



Kenneth



[web2py] Re: Export DB from admin?

2011-03-21 Thread Anthony
You can use export_to_csv_file, but I don't think you can do it from admin: 
http://web2py.com/book/default/chapter/06#CSV-(all-tables-at-once)
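
For reference, a minimal sketch of that call from the web2py shell; the app name and file path below are placeholders, and `db` is whatever your model defines:

```python
# Sketch: dump every table defined on `db` into a single CSV file.
# Run from the web2py shell ("myapp" is a placeholder app name):
#     python web2py.py -S myapp -M
def export_all(db, path):
    """Write all tables on `db`, one after another, to `path`."""
    with open(path, "wb") as f:
        db.export_to_csv_file(f)
```

In the shell you would then call `export_all(db, 'backup.csv')`.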
 
Anthony


[web2py] Re: Export DB from admin?

2011-03-21 Thread VP
Is export_to_csv the preferred way of backing up the database?


On Mar 21, 1:58 pm, Anthony abasta...@gmail.com wrote:
 You can use export_to_csv_file, but I don't think you can do it from 
 admin:http://web2py.com/book/default/chapter/06#CSV-(all-tables-at-once)

 Anthony


[web2py] Re: Export DB from admin?

2011-03-21 Thread pbreit
I'd be curious to hear thoughts on that as well. I was contemplating what to 
do when switching from SQLite to Postgres, but I'd also like to understand 
better what to do in production.

[web2py] Re: Export DB from admin?

2011-03-21 Thread villas
Each DB has its own methods. The integrity and consistency of
relational data is of crucial importance. I would only trust the
approved and recommended backup tool for the DB.

I normally write separate commandline scripts to do the backups which
run from cron during the night. The script also rsyncs a copy to
another server.

I keep a daily backup for an entire month naming the files db_01,
db_02 ... db_31.  Then the next month will overwrite the previous
month.  In my case backups over a month old are pretty useless.

Clearly you will have to analyse your own requirements. Those scripts
can run for years without any maintenance, so it's worth a little
thought to get them right.
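
A sketch of such a nightly script, assuming PostgreSQL's pg_dump and an rsync target of "backuphost:/backups/" (the tool, paths, and host are all placeholders; use your own DB vendor's recommended dump utility):

```python
# Nightly backup sketch: daily files db_01 .. db_31, so next month's
# run overwrites last month's file of the same day.
import datetime
import subprocess

def backup_name(day=None):
    """Return the daily backup file name, db_01 through db_31."""
    if day is None:
        day = datetime.date.today().day
    return "db_%02d" % day

def run_backup(dbname, dest_dir, remote="backuphost:/backups/"):
    path = "%s/%s.sql" % (dest_dir, backup_name())
    subprocess.check_call(["pg_dump", "-f", path, dbname])  # vendor dump tool
    subprocess.check_call(["rsync", "-a", path, remote])    # off-site copy
```

Pointed at by a cron entry, this gives exactly the rolling one-month window described above.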


On Mar 21, 11:58 pm, pbreit pbreitenb...@gmail.com wrote:
 I'd be curious to hear thoughts on that as well. I was contemplating what to
 do when switching from SQLite to Postgres, but I'd also like to understand
 better what to do in production.


[web2py] Re: Export DB from admin?

2011-03-21 Thread vtp2...@gmail.com
I don't know if it was because I used an old SQLite version (Debian 5), but
SQLite constantly had database lock problems (you can Google this). And
when there's a database lock, you essentially have to restart the server.
Other than that, SQLite is very fast; a lot faster than Postgres. I'd say
if you don't have problems with SQLite locking, don't switch to Postgres.

One thing I would advise: do not use id as the key, because when you
switch databases (say, inserting entries from SQLite into a Postgres
db), the ids might no longer be the same. So if you identify entries by
id from the old database, those ids might not point to the same entries
in the new one.


On Mon, Mar 21, 2011 at 6:58 PM, pbreit pbreitenb...@gmail.com wrote:

 I'd be curious to hear thoughts on that as well. I was contemplating what
 to do when switching from SQLite to Postgres, but I'd also like to understand
 better what to do in production.


[web2py] Re: Export DB from admin?

2011-03-21 Thread ron_m
I moved from MySQL to PostgreSQL a few months ago with a small database but 
with a significant number of many to many relations as well as one to many. 
Here is what I did:

1. Exported the data from MySQL using the export_to_csv_file call from the 
shell with the model activated, which pushes all the data out to one CSV.

2. Copied the application directory under web2py/applications, removed all 
the files under databases, and changed the connection string in the model.

3. Created the database in PostgreSQL, leaving it empty.

4. Ran the application, which ran the model and created all the tables and 
relations but no data.

5. Ran import_from_csv_file in a shell with the model activated, using the 
file from the export step. Don't forget the db.commit() at the end.
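
The export and import steps above can be sketched as a pair of small helpers; run each in a web2py shell with the model pointed at the matching connection string (`db` and the file name are placeholders):

```python
# Sketch of the migration's export/import pair.
def dump(db, path):
    with open(path, "wb") as f:
        db.export_to_csv_file(f)      # one CSV holding every table

def load(db, path):
    with open(path, "rb") as f:
        db.import_from_csv_file(f)    # remaps old ids to new ones
    db.commit()                       # don't forget the commit
```

First `dump(db, 'all.csv')` against the old database, then `load(db, 'all.csv')` against the new, empty one.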

The import_from_csv_file does a fantastic job of mapping the old ids to new 
ids in the foreign key relations; I didn't find any breakage in the data. 
This means the import doesn't need tricks like turning off auto-increment 
keys, as happens when a mysqldump to a file is read back in with mysql. Any 
database that has been used for a while will usually have holes in the id 
sequence caused by deleted records. import_from_csv_file packs the ids into 
a continuous sequence and then fixes the foreign key references by 
maintaining the old-to-new id mapping. If your application depends on 
knowing specific ids this is a problem, but then I would argue that is an 
application design issue.

I tested the application and found only one DAL statement that worked in 
MySQL but not in PostgreSQL. By changing the group-by field in the query 
to the field on the other side of the join, it worked in both.

By copying the application temporarily you preserve the 
application/databases directory, and since exporting to CSV does not touch 
the MySQL database, there isn't much danger of losing data.

If the database were very large, I am not sure how well this would work, 
because the old-to-new id mapping must be kept in memory.

For regular backups by all means use the designated utility provided by the 
database.


[web2py] Re: Export DB from admin?

2011-03-21 Thread pbreit
Great info, thanks!