Re: Help creating incremental backups using --backup-dir.

2009-04-16 Thread Eric Bravick


David,

I haven't found any other filesystems that directly support HFS metadata. However, you mentioned in your post that you tried the NAS-mounted sparse file approach but found it unreliable. Honestly, I'd fix whatever on your network is making it unreliable - I use this method extensively and haven't had any issues with it. I currently back up over 100 Macs with various combinations of NAS mounts, sparse files, ditto, asr, and rsync. I might have a couple of failures a year running nightly backups, and honestly, even when I've unmounted a sparse file dirty, it's never resulted in data loss. If I were you, I'd head back toward that strategy and make it work.
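As a rough sketch of the approach (the paths, size, and volume name here are illustrative, not my exact setup):

```shell
#!/bin/sh
# Create an HFS+ sparse image on the NAS mount (it grows on demand),
# then attach it, back up into it, and detach. Paths are illustrative.
IMAGE="/Volumes/NAS/mac-backup.sparseimage"
MOUNTPOINT="/Volumes/MacBackup"

create_image() {
    # -type SPARSE only consumes space as data is written, up to 100g
    hdiutil create -size 100g -type SPARSE -fs HFS+ \
        -volname MacBackup "$IMAGE"
}

run_backup() {
    hdiutil attach "$IMAGE" -mountpoint "$MOUNTPOINT" || return 1
    # -X (rsync 3.x) carries the extended attributes that SMB drops
    rsync -aX --delete "$HOME/Documents/" "$MOUNTPOINT/current/"
    hdiutil detach "$MOUNTPOINT"
}
```

Because the HFS+ filesystem lives inside the image, the Mac metadata never touches the NAS's native filesystem, which is why the underlying share can be almost anything.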


henri wrote:

Hi David,

I am also interested to know if anyone has found a filesystem which will store Mac OS X metadata. In the meantime, I would suggest that you back up to another Mac OS X machine with a pull backup strategy.


--
Regards,
--
-- Eric Bravick, V.P. of Engineering
-- Networked Knowledge Systems, LLC
-- Email =  ebrav...@nks.net
-- Skype,AOL,Yahoo,MSN,Jabber,Google = ebravick
--
--
Please use reply-all for most replies to avoid omitting the mailing list.
To unsubscribe or change options: https://lists.samba.org/mailman/listinfo/rsync
Before posting, read: http://www.catb.org/~esr/faqs/smart-questions.html


Re: Help creating incremental backups using --backup-dir.

2009-04-14 Thread henri
Have you considered pulling the backup via SSH rather than pushing it  
to the server over SMB? If pulling the backup is a possibility, I  
would recommend this configuration.
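As a rough illustration of what a pull looks like (the hostname and paths below are hypothetical), run on the backup server with unchanged files hard-linked against the previous snapshot:

```shell
#!/bin/sh
# Pull a snapshot from the Mac over SSH into a dated directory,
# hard-linking unchanged files against yesterday's snapshot.
# Hostname and paths are hypothetical.
SRC="mac.example.com:/Users/david/Documents/"
DEST="/backups/mac"
TODAY=$(date +%Y-%m-%d)
# GNU date first, BSD / Mac OS X date as the fallback
YESTERDAY=$(date -d yesterday +%Y-%m-%d 2>/dev/null || date -v-1d +%Y-%m-%d)

pull_backup() {
    rsync -aHX -e ssh --delete \
        --link-dest="$DEST/$YESTERDAY" \
        "$SRC" "$DEST/$TODAY/"
}
```

Since the receiving side is a local filesystem on the backup server, --link-dest works, and each daily directory costs only the space of the changed files.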


As an example, if you run LBackup (rsync backend) on the server and pull the backup, then hard links will work great. The following link contains documentation regarding pulling and pushing backups with LBackup: http://connect.homeunix.com/lbackup/network_backup


The way LBackup handles push backups is via disk images (virtual file systems). This means that before the backup starts, the virtual filesystem must be mounted, and when the backup has completed you may want to unmount it again. LBackup includes example pre and post scripts to deal with the image mounting and unmounting on Mac OS X by calling hdiutil. I have a similar setup in use at the moment which uses sshfs to host the disk image file. However, that particular backup is started manually and is closely monitored.
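A minimal pre/post pair along those lines might look like the following sketch (the paths are hypothetical; the example scripts shipped with LBackup are the authoritative versions):

```shell
#!/bin/sh
# Hypothetical pre/post hooks: attach the backup disk image before the
# transfer and detach it afterwards. Paths are illustrative only.
IMAGE="/Volumes/NAS/backup.sparseimage"
MOUNTPOINT="/Volumes/Backup"

pre_backup() {
    # Refuse to start the backup if the image will not mount.
    hdiutil attach "$IMAGE" -mountpoint "$MOUNTPOINT" -quiet || return 1
}

post_backup() {
    # Try a clean detach first; fall back to -force only if that fails.
    hdiutil detach "$MOUNTPOINT" -quiet || \
        hdiutil detach "$MOUNTPOINT" -force
}
```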


Keep in mind that LBackup is designed to pull the backup to the backup  
server.


If you have to push the backups and you are looking for reliability, then LSync could be another option. It is designed to work with pull or push backups and to provide failover support for servers, so LSync may be a better fit if you must perform a push backup. Incremental support with LSync is provided via Link-Backup. LSync details are available from the following URL: http://www.lucidsystems.org/tools/lsync


When using rsync to perform incremental backups with hard links enabled, I have found that pulling backups is very robust. In addition, LBackup and LSync (expert system under construction) will also provide you with various options with regard to backup reports.


Hopefully this information is helpful.


On 10/04/2009, at 6:11 AM, David Miller wrote:


Normally I would use the --link-dest option to do this, but I can't since I'm rsyncing from a Mac to a Samba share on a Linux box and hard links don't work. What I want to do is create a 10-day rotating incremental backup. I used the first script example on the rsync examples page as a template. The only things I changed were the destination (to a local directory) and the paths for the other variables. When I run the script, nothing gets copied into the directories named by the day of the week. Each day when the script runs, the directory with the name of the current weekday is created, but everything just goes into current and stays there. Can someone post an example that does work for what I'm trying to do? Below is the script I'm using.


#---
# directory to back up
BDIR=$HOME/Documents

BACKUPDIR=`date +%A`
OPTS="-aX --force --progress --ignore-errors --delete --backup --backup-dir=/$BACKUPDIR"

# the following lines clear last week's incremental directory
[ -d $HOME/emptydir ] || mkdir $HOME/emptydir
/usr/local/bin/rsync3.0.5 --delete -a $HOME/emptydir/ /Volumes/SAMBA/$BACKUPDIR/
rmdir $HOME/emptydir

# now the actual transfer
/usr/local/bin/rsync3.0.5 $OPTS $BDIR /Users/Shared/current
#---

Thanks.
David.


Re: Help creating incremental backups using --backup-dir.

2009-04-09 Thread David Miller


OK, I figured out the problem: I had to put in the full path for the --backup-dir option. However, I have run into another problem that makes doing this just about useless. If I rsync to an HFS+ volume, it works correctly. If I rsync to a Samba share, it gives me errors and puts files it thinks have been modified at the time of the sync into the --backup-dir directory. It also goes through and deletes all the ._ files. The errors I'm seeing are as follows:


rsync: get_xattr_names: llistxattr(Documents/web server diagram/web.graffle/._image2.jpg,1024) failed: Operation not permitted (1)

deleting Documents/web server diagram/web.graffle/._image2.jpg

I have checked the Samba server, and the files are being created with the correct owner, group, and permissions.
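For reference, the full-path fix boils down to something like this (sketched from the script quoted below rather than copied verbatim from my working version, so treat the exact paths as illustrative):

```shell
#!/bin/sh
# Sketch of the fixed options: --backup-dir gets an absolute path on
# the destination volume instead of a bare "/$BACKUPDIR".
BACKUPDIR=`date +%A`                 # e.g. "Monday"
DEST="/Volumes/SAMBA"                # the mounted Samba share
OPTS="-aX --force --progress --ignore-errors --delete --backup --backup-dir=$DEST/$BACKUPDIR"

# the actual transfer would then be:
# /usr/local/bin/rsync3.0.5 $OPTS "$HOME/Documents" "$DEST/current"
```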


Are there any filesystems under Linux that allow proper storage of the Mac metadata? I have tried XFS, ext3, and ext4 with no luck. I even tried creating a sparse disk image and mounting it from a Samba share, but that is too unreliable. If the connection is lost while data is being written to the image, it corrupts the image more often than not.


David.
On Apr 9, 2009, at 11:11 AM, David Miller wrote:

Normally I would use the --link-dest option to do this, but I can't since I'm rsyncing from a Mac to a Samba share on a Linux box and hard links don't work. What I want to do is create a 10-day rotating incremental backup. I used the first script example on the rsync examples page as a template. The only things I changed were the destination (to a local directory) and the paths for the other variables. When I run the script, nothing gets copied into the directories named by the day of the week. Each day when the script runs, the directory with the name of the current weekday is created, but everything just goes into current and stays there. Can someone post an example that does work for what I'm trying to do? Below is the script I'm using.



#---
# directory to back up
BDIR=$HOME/Documents

BACKUPDIR=`date +%A`
OPTS="-aX --force --progress --ignore-errors --delete --backup --backup-dir=/$BACKUPDIR"

# the following lines clear last week's incremental directory
[ -d $HOME/emptydir ] || mkdir $HOME/emptydir
/usr/local/bin/rsync3.0.5 --delete -a $HOME/emptydir/ /Volumes/SAMBA/$BACKUPDIR/
rmdir $HOME/emptydir

# now the actual transfer
/usr/local/bin/rsync3.0.5 $OPTS $BDIR /Users/Shared/current
#---

Thanks.
David.