Patrick Scribus writes:
Hello, two of my computers play a similar role as desktops. The installed packages are nearly the same, the configuration is nearly the same, and so is the data stored in /home. The texts, pictures and the like in particular take too much time and effort to keep in sync. At first I wrote a little script that uses the power of rsync; this is much better than no script at all. But I am hoping for a solution that automates this, like in those talks from 10-15 years ago that suggested using Coda. I would love to use Coda, but it seems nobody has maintained it for quite some time. What happened in the meantime? What do you use for similar tasks?
Hello, there have been multiple answers already, so forgive me if my post does not seem to add anything valuable. Still, it bugs me that many different solutions are proposed without their advantages and disadvantages. I think an important factor is how the systems' "online status" (in the sense of power and networking) is to be considered: Are both systems online simultaneously? Are both systems online at the same time only for synchronization?

I can think of different solutions depending on what is actually wanted/needed:

* Cluster file systems. People have already mentioned Ceph (which is more of an object storage system and thus slow on small files, IIRC). I can add OCFS (Oracle Cluster File System) to the list, although it is not so easy to set up. Cluster file systems make sense if both systems are online at the same time and should both access a common file system. Often, cluster file systems want a "third" machine for doing the actual storage work (e.g. an iSCSI target for OCFS). I have also tried out GFS2 in the past, but it is a PITA to set up!

* Synchronization tools. These are tools one invokes explicitly to perform the synchronization. They make sense if both systems are online at the same time only for synchronization; if not, one will need to resolve "both changed" conflicts manually. I have no experience with syncthing (mentioned in the thread) -- syncthing might have a solution for this...

* Network file systems. If you have a constellation where system2 being online means system1 is online, too, then you might install a "file server" (NFS or similar) on system1 and share files through this mechanism. Of all the approaches proposed, I would recommend this as the least complex in operation, although that does not mean it is the least complex to set up.

* "Cloud"-like file synchronization. These usually require a "third" server, too.
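To illustrate the NFS option, a minimal configuration sketch follows (a hypothetical setup: system1 exports /home/shared, and system2 is assumed to be at 192.168.1.20 -- both the path and the address are placeholders):

```
# On system1: /etc/exports
/home/shared  192.168.1.20(rw,sync,no_subtree_check)

# On system1, after editing /etc/exports:
#   exportfs -ra
#
# On system2, mount the share:
#   mount -t nfs system1:/home/shared /home/shared
# or persistently via /etc/fstab:
#   system1:/home/shared  /home/shared  nfs  defaults  0 0
```

Because both machines then read and write the same files directly, there is nothing to synchronize and no "both changed" conflicts, which is why this is operationally the simplest option whenever the availability constraint (system1 up whenever system2 is) holds.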
And in my experience, whenever one uses "synchronized" files for non-trivial data processing (e.g. creating and reading a lot of files, storing a database, accessing the data from many processes...), most of these systems will fail one way or another (up to causing data loss). Yet, most people using such systems do not seem to run into these issues :) These tools are useful in scenarios where there is no guarantee that any machine is online at the same time as the other, although this comes at the cost of running a "third" machine 24/7...

HTH and YMMV

Linux-Fan