tl;dr: use rsync -aHAX.

On Sun, Feb 28, 2016 at 04:04:09PM -0500, Bruce wrote:
> Is a 50Gb or greater reasonable for a tar.gz backup?

Not unless you're backing up *to* a filesystem that doesn't support
all the file attributes you need (e.g., the dreaded HFS+, NTFS, and FAT).
Especially not for…

> For instance my home directory is 62Gb and my media directory is
> 819Gb.

819 GB? Tar isn't really seekable to begin with (there's no index, so
finding one file means scanning member headers sequentially), and
compression makes it worse. At that archive size (or however much
smaller compressed), any random access means seeking for a while, or,
if the archive is compressed, decompressing everything before the
point where your file or directory sits. Worse still if the archive
was ever appended to: later members supersede earlier ones with the
same name, so you have to read all the way to the end.
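
To make that concrete, here's a rough sketch of what a single-file
restore looks like (archive name and member path are made up). gzip
streams can't be decompressed from the middle, so tar has to chew
through everything in front of your file:

    # Extract one file from a large compressed archive. tar must read
    # (and gunzip) every byte preceding that member; on a USB spinning
    # disk this can take nearly as long as reading the whole archive.
    time tar -xzf /mnt/usb/media-backup.tar.gz media/photos/img_0001.jpg

    # Against a plain rsync'ed tree, the same restore is just one copy.
    time cp /mnt/usb/media/photos/img_0001.jpg ~/restore/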

Now if this thing is on external media, most *likely* connected over
USB, most *likely* a hard drive rather than an SSD, you'll be sitting
there for a *long* time. Have fun trying to do the equivalent over sshfs
when logging into home from work over crappy Wi-Fi.

On a side note…

ustar, gnutar, or pax format? I'd recommend pax or gnutar so you at
least preserve extended attributes and long paths, which plain ustar
can't store. pax (POSIX.1-2001) is more portable, in theory, and extensible.
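
For example (archive name made up; --xattrs and --acls need a
reasonably recent GNU tar, and root to read other users' attributes):

    # Write the POSIX pax format and carry xattrs/ACLs along.
    # GNU tar strips the leading / from member names and says so.
    tar --format=posix --xattrs --acls -czf home.tar.gz "$HOME"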

> Just wonder at what point is it considered normal practiced to move
> from tar to rsync?

Like Strossberg said: when you want incremental backups, which should
be the norm these days for a use case like yours.
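
A minimal sketch of what that looks like with rsync (all paths here
are made up): --link-dest hard-links unchanged files against the
previous snapshot, so each run only stores what actually changed.

    # -a archive mode, -H hard links, -A ACLs, -X xattrs (the tl;dr).
    # Files identical to yesterday's snapshot become hard links, not copies.
    rsync -aHAX --delete \
        --link-dest=/mnt/backup/home-2016-02-27 \
        "$HOME"/ /mnt/backup/home-2016-02-28/

Every dated directory then looks like a full backup on its own, and
restoring a file is just cp.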

> Just wondering what other people are doing.

Well, I know there are two other rsync nuts on this list.