Reading through your post brought back many memories of how I used to manage my 
data.

I also found SuperDuper and Carbon Copy Cloner great for making a duplicate of 
my Mac's boot drive, which also contained my data.

After juggling cloned boot/data drives, non-redundant Time Machine backups, and 
assorted manual copies here and there, I said 'there must be a better way'. The 
long search ended with the idea of having fairly 'dumb' boot drives, containing 
just the OS and apps, in each desktop PC, and moving the data itself onto a 
redundant RAID NAS running ZFS. I won't bore you with the details any further 
-- see the link below if it's of interest. BTW, I still use SuperDuper for 
cloning my boot drive and it IS terrific.
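
In case it helps picture the setup, the NAS side boils down to a raidz pool 
with datasets on top. A minimal sketch, assuming a pool called 'tank' and three 
example disk device names (my actual layout differs):

  # Create a redundant raidz pool from three disks (device names are examples)
  zpool create tank raidz c1t0d0 c1t1d0 c1t2d0

  # Create a dataset to hold the shared data
  zfs create tank/data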

Regardless of where the data is, one still needs to do backups, as you say. 
Indeed, I know all about scrub and run it regularly; it's a great tool for 
guarding against silent data corruption, a.k.a. bit rot.
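
For anyone who hasn't tried it, a scrub is a single command, plus a status 
check to see the results; 'tank' is again just a placeholder pool name:

  # Read every block in the pool, verify checksums, and repair
  # from redundancy where possible
  zpool scrub tank

  # Check scrub progress and any errors found
  zpool status tank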

Once your data is centralised, making data backups becomes easier, although 
other problems like the human factor still come into play :)

If I left my backup system switched on 24/7, it would in theory be fairly easy 
to (1) automate NAS snapshots and then (2) automate zfs sends of the 
incremental differences between snapshots, but I don't want to spend the money 
on electricity for that.
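
Just to illustrate, (1) and (2) together could be a small nightly cron job 
along these lines; the dataset, the backup host, and the snapshot-tracking 
file are all made-up names, not my real setup:

  #!/bin/sh
  # Hypothetical nightly backup: snapshot the dataset, then send
  # the incremental difference since the previous snapshot to a
  # receiving pool on another box.
  FS=tank/data
  NEW=$(date +%Y%m%d)
  PREV=$(cat /var/run/last-backup-snap)

  zfs snapshot $FS@$NEW
  zfs send -i $FS@$PREV $FS@$NEW | \
      ssh backup-host zfs receive backuppool/data

  echo $NEW > /var/run/last-backup-snap

The receiving side needs nothing fancier than 'zfs receive' and enough disk; 
the real obstacle, as I said, is keeping the second box powered on.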

And when buying drives every few years, I always try to take advantage of 
Moore's law.

Cheers,
Simon

http://breden.org.uk/2008/03/02/a-home-fileserver-using-zfs/