On Jan 3, 2012 4:55 AM, "James Broadhead" <jamesbroadh...@gmail.com> wrote:
>
> I have a pile of files, and a personal svn repo, totalling around 13 GiB,
> which I want to back up cheaply to 'the cloud'. I would also like it to be
> non-trivial for someone with access to the cloud servers to decrypt my data.
>
> I have a 50 GB free account for Box.net, but would consider others if they
> have significant advantages. The Box.net account only allows uploading
> files of max 100 MiB at a time.
>
> Now, one problem facing me is that most cloud services don't give
> assurances of bit parity, so I'd like to be able to recover most of the
> files if I lost my local copies and there were bits missing from the
> uploaded backup. This makes the one-big-encrypted-file approach a no-go.
>
> My current approach is to use split-tar, with the intention of encrypting
> each file separately. (Is this worse than / equivalent to having one big
> file with ECB?)
> http://www.informatik-vollmer.de/software/split-tar.php
> ...but this seems to have difficulty staying below the 100 MiB individual
> file limit (possibly there are too many large files in the svn history).
>
> Any thoughts? I'm sure that many of you face this problem.
>
Make a tarball. Encrypt it. Split it with split(1). Protect it with par2, using the -l option to limit the size of the recovery files. Upload. Rgds,