You could consider another option: have a robust script (daemon?) on
each web server that periodically pushes its metrics data into the
master. I think this gives you better control over your data. If the
connection drops or something else goes wrong, each web server can tell
what DIDN'T make the transfer better than the master could. Also,
considering the volume of information your master server will be asked
to handle, it may be better NOT to require it to run 12 additional
polling daemons (or however they end up being written) on top of
everything else it is doing.
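
For illustration only, here is a minimal sketch of what such a push job
could look like. At 10,000 to 30,000 rows of roughly 100 bytes per
server per day you are only moving about 1-3 MB per server, so an
hourly batched INSERT over the wire is not much work for anyone. A few
assumptions up front: you mentioned Perl, but the sketch below is
Python (mysql-connector-python) just to keep it short, and the table
name `hits`, its columns, and the `server_id` column on the master are
all made up for the example.

#!/usr/bin/env python
# Sketch of a per-web-server "push" job, run from cron every N minutes.
# ASSUMPTIONS (not from the original thread): the local metrics table is
# called `hits` with an AUTO_INCREMENT `id` column; the master has the
# same table plus a `server_id` column; driver is mysql-connector-python.
import mysql.connector

LOCAL = dict(host="localhost", user="metrics", password="secret",
             database="webstats")
MASTER = dict(host="master.example.com", user="loader", password="secret",
              database="webstats")
SERVER_ID = 7          # unique id for this web server (hypothetical)
BATCH = 5000           # rows per INSERT batch

def last_pushed_id(master_cur):
    # The high-water mark is read from the master itself, so a run that
    # dies partway simply resumes from wherever the master actually got to.
    master_cur.execute(
        "SELECT COALESCE(MAX(id), 0) FROM hits WHERE server_id = %s",
        (SERVER_ID,))
    return master_cur.fetchone()[0]

def main():
    local = mysql.connector.connect(**LOCAL)
    master = mysql.connector.connect(**MASTER)
    lcur, mcur = local.cursor(), master.cursor()

    start = last_pushed_id(mcur)
    lcur.execute(
        "SELECT id, url, hit_time, bytes_sent FROM hits "
        "WHERE id > %s ORDER BY id", (start,))
    while True:
        rows = lcur.fetchmany(BATCH)
        if not rows:
            break
        mcur.executemany(
            "INSERT INTO hits (server_id, id, url, hit_time, bytes_sent) "
            "VALUES (%s, %s, %s, %s, %s)",
            [(SERVER_ID,) + row for row in rows])
        master.commit()   # commit per batch; a failure loses at most one batch

    lcur.close(); mcur.close()
    local.close(); master.close()

if __name__ == "__main__":
    main()

Run something like that from cron on each web server (say, every 15
minutes). Because the starting point comes from the master, a dropped
link mid-transfer just means the next run picks up after the last row
that actually arrived; nothing is lost and nothing gets loaded twice.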

My 2 cents,
Shawn Green
Database Administrator
Unimin Corporation - Spruce Pine

Marc Knoop <[EMAIL PROTECTED]> wrote on 07/23/2004 02:41:38 PM:

> Greetings! 
> 
> I have several (~12) web servers which all record web metrics to their own
> local mysql database.  I would like to consolidate data from each web server
> database to one master DB to within the hour.  I wish to avoid running
> multiple instances of mysql on the "master" server, so replication is not an
> option.
> 
> What are the best practices for managing the consolidation of data?  Is it
> best to export the data on each web server and perform frequent bulk loads
> on the "master" server?  Or, is it better to have a robust Perl script on
> the "master" server that is responsible for pulling records from each web
> server?  I estimate 10,000 to 30,000 records per web server, per day, with
> an average row size of 100 bytes.  The web servers are all in remote
> locations.
> 
> The end goal is to have all web metrics available on *one* server from which
> a reporting server (M$ SQL server) can pull the data.
> 
> Lastly, are there any experts on this list willing and available to code and
> document this, given more details, of course?
> 
>  --
> ../mk 
> 
