Hi,

1) How do you handle deduplication on the storage layer and the networking layer? For example, if a user changes 2 random bytes in a 50 MB file, or renames a file, what kind of network traffic does this cause, and what are the implications for storage consumption? Is it supported to version everything, i.e. keep x (or infinitely many) versions of all files?

It seems to me that backends like FTP are a very limiting factor, because some protocols are really dumb with respect to efficiency. Do you upload all files in small parts (how small?) to the FTP server, so as to minimize the syncing needed for minimal changes in a big file? Do you encrypt small blocks of the file? Because re-encrypting a file that has only changed a little will yield a completely different encrypted variant, or not? Maybe you could use rolling checksums like rsync does (though even that is not ideal); a rough sketch of the kind of chunking I mean is below. AFAIK git actually has a pretty efficient blob storage and synchronisation system, so you could put encrypted blobs in git's storage system and get some features for free. Or camlistore: http://camlistore.org/

Also, when you keep, say, 2 GB in Syncany (without modifying any file or doing anything special), will that cause an additional 2 GB (or more) of storage overhead, because it also needs to store the encrypted variant of all files locally?
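To make the rolling-checksum idea concrete, here is a rough, untested Python sketch of content-defined chunking. This is not how Syncany actually works, and all names and parameters (WINDOW, MASK, the chunk size limits) are made up for illustration. The point is that chunk boundaries depend only on local content, so changing 2 bytes in a big file should invalidate only the chunk(s) around the change, and only those chunks would need to be re-encrypted and re-uploaded:

import hashlib
import os

WINDOW = 48            # rolling window size in bytes (assumed value)
MASK = (1 << 13) - 1   # boundary mask -> ~8 KiB average chunks (assumed)
MIN_CHUNK = 2 * 1024   # assumed lower bound on chunk size
MAX_CHUNK = 64 * 1024  # assumed upper bound on chunk size
BASE = 257             # polynomial base for the Rabin-Karp style hash
MOD = (1 << 31) - 1    # hash modulus

def chunks(data):
    """Yield (sha1_hex, chunk_bytes) with content-defined boundaries."""
    power = pow(BASE, WINDOW - 1, MOD)  # weight of the oldest window byte
    start = 0   # start offset of the current chunk
    h = 0       # rolling hash over the last WINDOW bytes
    filled = 0  # how many bytes the window currently holds
    for i, b in enumerate(data):
        if filled == WINDOW:
            # slide: drop the contribution of the byte leaving the window
            h = (h - data[i - WINDOW] * power) % MOD
        else:
            filled += 1
        h = (h * BASE + b) % MOD
        size = i - start + 1
        # cut a chunk when the hash hits the boundary pattern (but not
        # before MIN_CHUNK), or when the chunk grows past MAX_CHUNK
        if (size >= MIN_CHUNK and (h & MASK) == 0) or size >= MAX_CHUNK:
            chunk = data[start:i + 1]
            yield hashlib.sha1(chunk).hexdigest(), chunk
            start, h, filled = i + 1, 0, 0
    if start < len(data):
        chunk = data[start:]
        yield hashlib.sha1(chunk).hexdigest(), chunk

if __name__ == "__main__":
    # flip 2 bytes in the middle of 4 MiB of random data and compare
    original = os.urandom(4 * 1024 * 1024)
    changed = bytearray(original)
    changed[2_000_000] ^= 0xFF
    changed[2_000_001] ^= 0xFF
    before = {digest for digest, _ in chunks(original)}
    after = {digest for digest, _ in chunks(bytes(changed))}
    print(len(before), "chunks;", len(before - after), "changed")

Encrypting each chunk separately (rather than the whole file) would then keep the "completely different encrypted variant" problem local to the changed chunks, at the cost of exposing chunk boundaries and sizes to the storage backend.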
2) How do you handle sharing between several users? How about conflicts? Are there means for manual and automatic conflict resolution?

Dieter

