Hi all,

I am currently running Bacula v5.0.3 in a production environment. It backs up about 
25TB of data, covering a week's worth of Full & Incremental backups, and I need to 
migrate an additional 65-70TB onto it within the next few weeks. I currently keep 
3 weeks' worth of backups, so you can probably see my problem. If possible, I would 
like to store at least 2 weeks' worth of data for all of these boxes.

I am not currently using any compression. I see that LZO compression is available as 
of 5.2.1, which would help, but I am not familiar with the compression ratio LZO 
typically achieves.
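
For reference, this is roughly how I expect to turn LZO on once we are on 5.2.x; 
the FileSet name and the /data path below are just placeholders, not my real config:

  FileSet {
    Name = "Example-FS"
    Include {
      Options {
        signature = MD5
        # LZO is available in the FileSet Options as of 5.2.x; it compresses
        # on the client before the data is sent to the storage daemon
        compression = LZO
      }
      # placeholder path, not one of the real boxes
      File = /data
    }
  }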
Basically, if I can somehow fit 2 weeks' worth of backup data (roughly 180TB 
uncompressed) on a box that will have 100-120TB available, that would be ideal; 
that works out to needing somewhere around a 1.5-1.8:1 compression ratio overall. 
I am open to any & all suggestions to get this done with Bacula. We are seriously 
considering an upgrade to at least 5.2.1, possibly 5.2.6, which I'm sure would help. 
I have already reduced my maximum volume size to 15GB for the pool holding the 
smaller backups & 150GB for the pool holding the larger ones; the pool definition 
sketched below shows the general shape.
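
The smaller-backup pool looks roughly like this (the pool name is a placeholder, 
and I am assuming AutoPrune/Recycle here; the 15GB volume limit and the 3-week 
retention are what I am actually running today):

  Pool {
    # name is a placeholder for my real small-backup pool
    Name = "Small-Pool"
    Pool Type = Backup
    # keep individual volumes small so they can be pruned and recycled sooner
    Maximum Volume Bytes = 15G
    Volume Retention = 3 weeks
    # assumed settings; pruning/recycling is how the 3-week cycle turns over
    AutoPrune = yes
    Recycle = yes
  }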
Please let me know your thoughts & ideas on how to implement a solution for this. 
Thanks!