Hi Folks,

We have a requirement to deal with large databases, on the order of
terabytes, when we go into production. What is the best database backup
mechanism, and what are the possible issues?

pg_dump can back up a database, but the dump file is limited by the OS
file-size limit. What about the option of compressing the dump file? How
much time does it generally take for large databases? I have heard it can
take far too long (even a day or two). I haven't tried it out, though.
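
To be concrete, something along these lines is what I had in mind (mydb
and the chunk size are just placeholders):

    # compress on the fly so the full dump never sits uncompressed on disk
    pg_dump mydb | gzip > mydb.dump.gz

    # restore the compressed dump
    gunzip -c mydb.dump.gz | psql mydb

    # or split the dump into 1 GB pieces to stay under the file-size limit
    pg_dump mydb | split -b 1000m - mydb.dump.

    # restore by concatenating the pieces
    cat mydb.dump.* | psql mydb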

What about taking a zipped backup of the database directory? We tried
this out, but the WAL data in the pg_xlog directory is also being backed
up. Since these logs keep growing from day one of database creation, the
backup size is increasing drastically.
Can we back up only certain subdirectories without loss of information
or consistency?
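
For reference, what we tried was roughly the following (the data
directory path is just an example from our setup; I am not sure whether
the --exclude variant leaves a consistent, restorable backup, which is
really my question):

    # full zipped copy of the data directory (taken with the server
    # stopped, since a filesystem copy of a running server need not
    # be consistent)
    tar czf pgdata-backup.tar.gz /var/lib/pgsql/data

    # what we would like: the same, but skipping the WAL segments
    tar czf pgdata-backup.tar.gz --exclude=pg_xlog /var/lib/pgsql/data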

Any quick comments/suggestions in this regard would be very helpful.

Thanks in advance,
Ravi Kumar Mandala
