Hello, two-year Bacula user here, but a first-time poster. I'm currently dumping
about 1.6 TB to LTO-2 tapes every week, and I'm looking to migrate to a new
storage medium.
The obvious answer, I think, is a direct-attached disk array (which I would be
able to put in a remote gigabit-attached datacenter before too long). However,
I'm wondering whether anyone is currently doing large (or what seem to me to be
large) backups to the cloud in some way. Assuming I have a gigabit connection
to the Internet from my datacenter, how feasible would it be to either use
something like Amazon S3 mounted via s3fs (my guess is the overhead would be far
too high to be efficient), or to run bacula-sd on an EC2 instance, using
Elastic Block Store (EBS) as its "local" disk, with a VPN (Amazon VPC) between
my datacenter and the SD?
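To make that second option concrete, here is a rough sketch of what I have in
mind; every name, address, password, and path below is a placeholder, not a
tested setup:

```conf
# bacula-dir.conf on my side: point the Director at the cloud SD.
# The address would be the SD's private VPC address, reached over the VPN.
Storage {
  Name = CloudSD                 # placeholder name
  Address = 10.0.0.10            # placeholder private VPC address
  SDPort = 9103
  Password = "sd-password"       # placeholder
  Device = EBSFileStorage
  Media Type = File
}

# bacula-sd.conf on the EC2 instance: file-based volumes on an EBS mount.
Device {
  Name = EBSFileStorage
  Media Type = File
  Archive Device = /mnt/ebs/bacula   # EBS volume mounted here (placeholder path)
  LabelMedia = yes
  Random Access = yes
  AutomaticMount = yes
  RemovableMedia = no
  AlwaysOpen = no
}
```

In other words, from Bacula's point of view it would just be ordinary
disk-based storage; the fact that the disk is EBS and the link is a VPN would
be invisible to the Director.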
Substitute your favorite cloud provider for Amazon above; I don't use any right
now, so I'm not tied to any particular provider. It just seems like Amazon has
all the necessary pieces today.
To do this, and to keep customers comfortable with the idea of their data
living in the cloud, we would need to encrypt. So I'm also wondering whether it
would be possible for the SD to encrypt the backup volumes, rather than having
each FD encrypt the data before sending it to the SD (which is what we do now).
It would be easier to manage if we handled encryption in one place for all
clients.
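For context, our current setup uses Bacula's PKI data encryption on each
client, roughly like this in bacula-fd.conf (names and paths changed):

```conf
FileDaemon {
  Name = client1-fd                            # placeholder client name
  Working Directory = /var/bacula/working
  Pid Directory = /var/run
  # Sign and encrypt data on the client before it ever reaches the SD
  PKI Signatures = Yes
  PKI Encryption = Yes
  PKI Keypair = "/etc/bacula/client1-fd.pem"   # this client's cert + private key
  PKI Master Key = "/etc/bacula/master.cert"   # public master key, for recovery
}
```

As far as I can tell from the documentation, this encryption only happens on
the FD side, which is exactly why I'm asking whether the SD could do it
instead.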
I would love to hear what others are doing with Bacula and the cloud, or why
you have decided against it.
Thanks
Peter Zenge
Pzenge .at. ilinc .dot. com
_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users