Hello...

This is what I have now: PostgreSQL 8.0.1 - the database weighs about 60GB and grows by about 2GB per week. Currently I do a backup every day, following a simple procedure (pg_start_backup, rsync the data directory, pg_stop_backup, save the WALs produced during the backup). On a 1Gbit internal network this usually takes about 1 hour.
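
For reference, the daily procedure looks roughly like the sketch below; the data directory path, the rsync target, and the WAL archive location are just placeholders standing in for my actual setup:

    # daily base backup - paths and hostnames below are placeholders
    PGDATA=/var/lib/pgsql/data            # assumed data directory
    DEST=backuphost:/backup/pgsql         # assumed rsync target

    # mark the start of the base backup (forces a checkpoint)
    psql -U postgres -c "SELECT pg_start_backup('daily');"

    # copy the running cluster to the backup array
    rsync -a --delete "$PGDATA/" "$DEST/data/"

    # mark the end of the base backup
    psql -U postgres -c "SELECT pg_stop_backup();"

    # keep the WAL segments archived (via archive_command) during the backup window
    rsync -a /backup/wal_archive/ "$DEST/wal/"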

But what if my database reaches ~200GB or more (I know that's still in the future :D)? From my point of view it won't be a good idea to copy the entire database to the backup array every time. I would like to hear opinions on this case - what do you propose? Maybe some of you already do something like this?

Regards,
Marcin Giedz
