Hello,

I am replicating a fairly large database (~100 million
rows in one table; the other tables have hundreds to
thousands of rows each). Around 30,000 rows are added
to the large table every hour.

Are there any general Slony maintenance guidelines for
such a scenario? For example, I noticed last night
that sl_log_1 was very large (~25 million entries).
Does Slony clean this up on its own, or do I need to
do anything?
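For anyone wanting to check this on their own cluster, a quick sketch (the cluster schema name "_mycluster" and database name "mydb" below are placeholders for your own setup):

```shell
# Approximate row count of Slony's first log table.
# Replace _mycluster with "_<your cluster name>" and mydb with your database.
psql -d mydb -c "SELECT count(*) FROM _mycluster.sl_log_1;"
```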

Also, what are good values for the slon "-g", "-o",
and "-c" options in this scenario?
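For context, here is a hedged sketch of how those options appear on a slon command line, assuming they refer to slon's -g (max SYNC group size), -o (desired sync time, in milliseconds), and -c (cleanup cycles) switches. The values, cluster name, and conninfo below are purely illustrative, not recommendations:

```shell
# Illustrative slon invocation; "mycluster" and the conninfo are placeholders.
# -g 20     group up to 20 SYNC events per replication round
# -o 60000  aim for ~60 s of data per sync
# -c 6      run a cleanup/vacuum pass every 6 cleanup cycles
slon -g 20 -o 60000 -c 6 mycluster "dbname=mydb host=replica user=slony"
```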

Thanks,

_______________________________________________
Slony1-general mailing list
[email protected]
http://gborg.postgresql.org/mailman/listinfo/slony1-general