On Wed, 9 Feb 2005 13:10:16 +0200, Paul Godard <[EMAIL PROTECTED]> wrote:
> Hi
> 
> I use MySQL/PHP on several client web sites (each client has its own
> db).  For development purposes, I have a local web server with
> PHP/MySQL.  While a site is still in development, the local db is the
> master, even if a copy has been uploaded to the live web server at
> my ISP.  When a site is completed, the web db becomes the master (as
> clients and visitors update data on the web db).
> 
> My problem is that I would like to keep the local db synchronized
> with the web db (at least for the tables that are regularly updated
> on the web).  Manually, I would do a dump (data export only) of these
> web tables into a local file, then empty the equivalent local tables
> and then run the SQL statements (INSERTs) from the dump file.
> 
> I don't do a systematic download of the whole web db, as it is
> sometimes too big and, besides, not all the tables are updated by
> clients/visitors.
> 
> My idea is to dynamically build a MySQL script via PHP that can do
> the job for each client web site, for each of the tables contained in
> an array, and that can be run with a simple click of a button on the
> local server, of course.
> 
> As I am not a MySQL/PHP guru, I would appreciate suggestions on how
> to do this as simply as possible.
> 

Have you thought about using mysqldump piped to the local DB?

For example:

mysqldump -h server_name_or_ip -u username -ppassword db_name \
    table1 [table2 ...] | mysql -u localuser -p db_name

This is (I believe) shown in the on-line manual under mysqldump.

It fulfills my needs. You could even set this up in cron (scheduled
tasks), though note that a bare -p makes mysql prompt for the password
interactively, so an unattended job would need the password on the
command line (-ppassword) or in an option file.
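
If you would rather drive it from PHP with a button click, as you
describe, something along these lines might work. This is only a
rough, untested sketch: it assumes shell_exec() is enabled, that
mysqldump and mysql are on the local server's PATH, and every host
name, credential and table name below is a placeholder for you to
fill in per client.

<?php
// Rough sketch: pipe mysqldump from the web db into the local db
// for the tables listed in the array. All hosts, credentials and
// table names here are placeholders, not real values.
$webHost   = 'web.example.com';
$webUser   = 'webuser';
$webPass   = 'webpass';
$localUser = 'localuser';
$localPass = 'localpass';
$dbName    = 'client_db';
$tables    = array('orders', 'customers');  // tables updated on the web

// Quote each table name for the shell, then join them.
$tableList = implode(' ', array_map('escapeshellarg', $tables));

$cmd = sprintf(
    'mysqldump -h %s -u %s -p%s %s %s | mysql -u %s -p%s %s',
    escapeshellarg($webHost),
    escapeshellarg($webUser),
    escapeshellarg($webPass),
    escapeshellarg($dbName),
    $tableList,
    escapeshellarg($localUser),
    escapeshellarg($localPass),
    escapeshellarg($dbName)
);

shell_exec($cmd);
?>

If I remember right, mysqldump's default output already includes DROP
TABLE statements, so your "empty the local tables first" step happens
for free when the dump is replayed. One caveat: passwords given on the
command line can show up in the process list, so an option file is
safer if other users share the machine.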

HTH

Coz

-- 
CozWeb Solutions Ltd
http://www.cozweb.net
