I imagine we can save some space on our file server by cleaning up all the files that are saved multiple times by different people. Linux already has the fdupes command, which will scan a directory tree and report which files are duplicates. It would be easy to script this to turn those duplicate files into symlinks to a single copy.
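Something like this untested sketch is what I am picturing (it assumes fdupes' default output, where each group of identical files is a block of paths separated by a blank line, and it keeps the first path of each group as the real file):

#!/usr/bin/env python3
# Untested sketch: turn the duplicates fdupes finds into symlinks
# to a single shared copy.
import os
import subprocess
import sys

def dedupe(root):
    # fdupes -r prints groups of identical files, one path per line,
    # with a blank line between groups.
    out = subprocess.run(["fdupes", "-r", root],
                         capture_output=True, text=True, check=True).stdout
    for group in out.strip().split("\n\n"):
        paths = [p for p in group.splitlines() if p]
        if len(paths) < 2:
            continue
        # Keep the first file as the real copy; relink the rest to it.
        original, duplicates = paths[0], paths[1:]
        for dup in duplicates:
            os.remove(dup)
            os.symlink(os.path.abspath(original), dup)

if __name__ == "__main__":
    dedupe(sys.argv[1])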
The problem I see, then, is what would happen if someone tries to change a duplicate file that they think is their own copy. Of course, everyone with a symlink to that file would get the changes, which is not what I would want. What it would need is some sort of copy-on-edit mechanism: when the file is changed, the symlink is replaced with the edited copy instead of the shared original being modified (a rough sketch of what I mean is below, after my sig). Does this make sense? Has anyone else thought about this, or found an elegant solution?

James Dinkel
Network Engineer
Butler County of Kansas

There are 10 types of people in the world: those who understand binary, and those who don't.
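P.S. Here is an untested sketch of the kind of copy-on-edit wrapper I mean. open_for_edit is a hypothetical helper, so it would only protect programs that actually call it; doing this transparently for every application would need support lower down (e.g. in the filesystem, or in Samba itself).

import os
import shutil

def open_for_edit(path, mode="r+b"):
    # Copy-on-edit: if the path is one of our dedupe symlinks, replace
    # it with a private copy of the target before opening it for
    # writing, so everyone else's links to the shared file are
    # left alone.
    if os.path.islink(path):
        target = os.path.realpath(path)
        os.remove(path)             # removes only the symlink itself
        shutil.copy2(target, path)  # private copy takes the link's place
    return open(path, mode)

# e.g.  with open_for_edit("/srv/share/report.txt") as f: f.write(b"...")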