Philipp,

Thanks much for the Syncany lesson! I like that it keeps internal chunk versions; that makes it cheaper for me and, presumably, easier to handle on the client side. If files are chunked and pushed to a remote server (FTP, etc.), are they stored on the filesystem there in their encrypted/chunked form? Presumably yes, to keep the content away from a hosting provider's prying eyes (e.g. Dropbox).
I take your point re. the production-readiness of the Syncany code. I'm playing with this knowing full well about the alpha nature of the code; that's not a problem.

I realize that the 'normal' backup strategy is a scheduled thing, but for me that's just an artifact of my being (a) lazy and (b) still wanting backups (cron the script, easy peasy). A hot backup that continuously pushes changes would be remarkably cool. Maybe not an entire home directory's worth of files, but the important bits in my case are pretty easily identified by their directories. Having Syncany just work in the background, pushing my changes over the wire without my having to think about it, is a completely awesome idea. Thought-free, hassle-free backups and less work for me to administer are all good things in my book. If I get to sync to another computer, that's just icing on the cake.

Forgive me if this is already written up in the project, but are there plans for a server-side way to peek at the file contents? I can see having some EC2/S3 combination with a little app that has the Syncany bits, requires some administrator key configuration, and offers the ability to 'share' or 'view history of this file', so I can get back the file that's two versions old that I just accidentally deleted.

It would be dynamite to package up a self-contained deployable AMI that provided the server, storage, and a web front end as a backup-in-a-box package. Most backup packages I've worked with (not many of them, I admit) are clunky and require lots of server-side configuration, SSH key sharing, or rsync, which is fine for my Linux boxen but not the Windows machines. A standalone server image that can run anywhere and read (or even provide) the same backend that Syncany clients sync to would be useful to me. I imagine this space has been tackled before, but perhaps not with as little configuration as Syncany seems to promise.
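As an aside, the background "hot backup" idea above could, in its simplest form, be driven by periodic mtime polling rather than filesystem watches. A minimal sketch (this is purely illustrative and not how Syncany's client actually works; the function name and the polling approach are assumptions):

```python
import os

def scan_changes(root, seen):
    """One polling pass over `root`: return files whose mtime changed
    since the last pass. `seen` maps path -> last observed mtime and
    is updated in place; the first pass therefore returns everything."""
    changed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            mtime = os.path.getmtime(path)
            if seen.get(path) != mtime:
                seen[path] = mtime
                changed.append(path)  # these would then be chunked and pushed
    return changed
```

Calling this in a loop with a short sleep gives continuous pushing of "the important bits" without per-directory watches, at the cost of rescanning the tree each pass.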
Thanks again for the info. I'll look at building Syncany with eyes open, realizing what I'm getting into.

Mark

On Mon, Jun 20, 2011 at 7:52 AM, Philipp Heckel <[email protected]> wrote:
> Hello Mark,
>
>> I'm wrestling with the idea of using syncany to drive backups. I'd
>> like to transition away from cron/rsync backup sets to a local server,
>> and move to syncany pushing updates to an S3 bucket.
>
> First off, it's important to know that Syncany is very far from being
> production ready. Please wait for a stable release before actually
> dumping your old solution. Things are gonna change until then - big
> time :-D
>
>> From my meager understanding of S3, a resource can be soft-deleted if
>> the client moves the resource to the Trash instead of issuing a REST
>> Delete operation directly on the file. Is this something that the
>> syncany-s3 plugin provides?
>
> Syncany has its own versioning mechanism, so we don't need any extra
> functionality from the storage. Files are chunked into little pieces, and
> as long as all chunks exist on the remote storage, we can reassemble
> the old files. Syncany hence does not use (and does not need) any
> special features of S3.
>
>> If not, am I better off turning on S3's versioning feature
>> (http://aws.amazon.com/s3/faqs/#What_is_Versioning) to have a complete
>> change history of deleted stuff?
>
> S3 versioning would only produce costs, since it keeps storing
> deleted files. So turning it off means saving money :-)
> By default, S3 versioning is turned off.
>
>> For what it's worth, I had thought of doing home directory sync via a
>> git repo, but using syncany + S3 + S3-versioning may be the killer
>> replacement and let me manage fewer servers myself.
>
> Syncany's backup functionality is at the moment somewhat limited.
> The first goal is to make a file synchronization tool (like Dropbox),
> i.e. with live synchronization... Then, when that works, we can
> implement an on-demand sync mechanism (= backup).
>
> Sorry to disappoint you, but syncing the home directory won't work,
> since it must watch all directories and subdirectories... and Linux
> would be unable to cope with so many watched directories.
>
> Cheers,
> Philipp

--
Mailing list: https://launchpad.net/~syncany-team
Post to     : [email protected]
Unsubscribe : https://launchpad.net/~syncany-team
More help   : https://help.launchpad.net/ListHelp
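The chunk-based versioning Philipp describes above can be sketched roughly like this. This is a toy illustration, not Syncany's actual chunker or storage format; the fixed chunk size, SHA-1 hashing, and dict-as-remote-store are all assumptions made for the example:

```python
import hashlib

CHUNK_SIZE = 4  # tiny, for illustration only; a real chunker uses far larger blocks

def split_into_chunks(data, store):
    """Chunk a byte string, store each chunk under its content hash,
    and return the ordered chunk-id list that describes this version."""
    chunk_ids = []
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        chunk_id = hashlib.sha1(chunk).hexdigest()
        store[chunk_id] = chunk          # identical chunks are stored only once
        chunk_ids.append(chunk_id)
    return chunk_ids

def reassemble(chunk_ids, store):
    """Rebuild any version, old or new, as long as all its chunks still exist."""
    return b"".join(store[cid] for cid in chunk_ids)

# Two versions that share content share chunks, so keeping history is cheap:
store = {}
v1 = split_into_chunks(b"AAAABBBB", store)   # version 1 of a file
v2 = split_into_chunks(b"AAAACCCC", store)   # version 2 after an edit
assert len(store) == 3                       # the "AAAA" chunk is stored once
assert reassemble(v1, store) == b"AAAABBBB"  # the old version is still recoverable
```

Deleting a file in a scheme like this only drops its chunk list; while the chunks remain on the remote, any earlier version can be reassembled, which is why the quoted reply says S3's own versioning feature would add cost without adding capability.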

