Stephen,
Thanks for the reply. I've just looked at rsnapshot and it looks good. I
might give it a try. I think I'll steal your idea of "hourly backups to a
local disk, and daily backups to a remote server via rsync over ssh". If you
have a script you don't mind sharing, shoot it over to me.
> On 16-02-28 09:10 PM, Bruce Harding wrote:
>>
>> The home server houses a photo gallery with 13000 pictures and a blog.
>> Should the number of files have any bearing on the choice of backup?
Your choice of backup should be based on ease of use and reliability
first. Ease of use so that you use…
___
Linux mailing list
Linux@lists.oclug.on.ca
http://oclug.on.ca/mailman/listinfo/linux
On 16-02-28 09:10 PM, Bruce Harding wrote:
Hello Bill, long time no talk. I'm finally getting around to making
backups of my home computer and home server. The first script I'm
looking at creates backups on the machine which houses the original
files. It uses tar to do this.
Next would be th…
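For anyone following along, a minimal sketch of that tar approach — archiving a directory into a dated .tar.gz on the same machine (the temp dirs here are stand-ins for the real gallery/blog directories and backup location):

```shell
#!/bin/sh
# Minimal tar backup sketch: archive a source dir into a dated .tar.gz.
# Temp dirs are used for demonstration; point SRC and DEST at your real
# data directory and backup disk in practice.
SRC=$(mktemp -d); DEST=$(mktemp -d)
echo "hello" > "$SRC/example.txt"     # stand-in for real data
STAMP=$(date +%Y-%m-%d)
ARCHIVE="$DEST/backup-$STAMP.tar.gz"
# -c create, -z gzip, -p preserve permissions, -f output file;
# -C stores paths in the archive relative to the source directory.
tar -czpf "$ARCHIVE" -C "$SRC" .
# Verify the archive lists the file we put in.
tar -tzf "$ARCHIVE" | grep -q example.txt && echo "backup ok"
```

One archive per day like this is simple, but note you pay the full size every run, which is where rsync starts to win.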
tl;dr, use rsync -aHAX.
On Sun, Feb 28, 2016 at 04:04:09PM -0500, Bruce wrote:
> Is 50 GB or greater reasonable for a tar.gz backup?
Not unless you're backing up *to* a filesystem that doesn't support
all the file attributes you need (e.g., dreaded HFS+, NTFS, and FAT).
Especially not for…
>
For a one-time thing, tar is simple. If you are incrementally backing up
the same general data set, rsync will save a lot of time and data throughput.
I think about the location of the data as well - if it is local, the
decisions are different than for data located remotely across slower and
more expens…
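To make the incremental point concrete, here's a sketch (temp dirs standing in for real paths) of rsync's --link-dest, the mechanism tools like rsnapshot use: unchanged files in the new snapshot become hard links to the previous one, so each run only stores and transfers what changed:

```shell
#!/bin/sh
# Incremental snapshots with rsync --link-dest (temp dirs as stand-ins
# for a real source directory and backup disk).
SRC=$(mktemp -d); BACKUPS=$(mktemp -d)
echo "v1" > "$SRC/unchanged.txt"
rsync -a "$SRC/" "$BACKUPS/snap.0/"          # first, full snapshot
echo "v2" > "$SRC/new.txt"                   # only this file is new
# Second snapshot: files identical to snap.0 are hard-linked, not copied.
rsync -a --link-dest="$BACKUPS/snap.0" "$SRC/" "$BACKUPS/snap.1/"
# Same inode in both snapshots => the unchanged file costs no extra space.
ls -i "$BACKUPS/snap.0/unchanged.txt" "$BACKUPS/snap.1/unchanged.txt"
```

So each snapshot looks like a full copy you can browse, but disk usage only grows by the changed files — a big deal for something like an 800 GB media directory that mostly sits still.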
Is 50 GB or greater reasonable for a tar.gz backup? I've read that a tar file
can be as big as 4 TB? Just wondering at what point it is considered normal
practice to move from tar to rsync? For instance, my home directory is 62 GB
and my media directory is 819 GB.
Just wondering what other people…