I doubt that dd will do much better than tar, maybe a few percent, since it
can skip the filesystem layer, BUT you will have to take an entire image of
the drive because dd doesn't do file-level backups.
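For reference, the two approaches look roughly like this (device and mount
point names here are made up, adjust them for your setup):

    # dd copies the raw block device, used space and free space alike
    dd if=/dev/sdb of=/mnt/external/disk-image.img bs=4M

    # tar works at the file level and only archives what is actually there
    tar -czpf /mnt/external/pool.tar.gz -C /var/lib/backuppc .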
Are you pushing tar through gzip? Maybe try a lower compression level if
your CPU is overworked (see the example below). You should be able to tar up
76GB from one disk to another pretty fast; 76GB shouldn't take more than an
hour or two. Do you have DMA on? What is the interface on your drives? I
have a 750GB (~700GB used) array that I can dump to a USB disk with tar and
gzip in about 8-10 hours. I now use rsync because the bulk of the files are
already in place, but I always use tar for the initial dumps because it's
faster; rsync takes a few hours more. I do have a fast system on RAID5, but
I'm dumping it to a USB disk, which is an obvious bottleneck.
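If gzip is what's pegging the CPU, something like this should help (the
paths are only examples; the point is the lower compression level):

    # gzip -1 trades a somewhat larger archive for far less CPU time
    tar -cpf - -C /var/lib/backuppc . | gzip -1 > /mnt/usb/pool.tar.gz

    # or skip compression entirely if the USB disk has the room
    tar -cpf /mnt/usb/pool.tar -C /var/lib/backuppc .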
It seems like a lot of people are having performance issues that I can't
duplicate. I'm wondering if everyone is using hdparm on boot to set DMA, or
what the issue is.
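If anyone wants to check, hdparm will show and set the DMA flag on an IDE
drive, something like this (the device name is just an example):

    # show the current DMA setting
    hdparm -d /dev/hda

    # turn DMA on (this is the sort of thing people put in a boot script)
    hdparm -d1 /dev/hda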
On Jan 3, 2008 8:47 PM, Sean Carolan <[EMAIL PROTECTED]> wrote:
> Thanks, Dan. I'm actually running tar right now and it's into its
> second day, 76GB done so far. Maybe I'll have better luck with dd,
> since tar is a bit slow!
>
> On Jan 3, 2008 4:35 PM, dan < [EMAIL PROTECTED]> wrote:
> > Here's a thought: for an external backup of the disk, don't use rsync,
> > just copy it over to the external drive. You could just tar.gz it over
> > and preserve the hard links also.
> >
> >
> >
> > On Jan 3, 2008 7:29 AM, Sean Carolan < [EMAIL PROTECTED]> wrote:
> > >
> > >
> > >
> > > > This is topic is discussed pretty regularly on this mailinglist.
> > > > Please also search the archives.
> > >
> > > Thanks, Nils. If the sourceforge.net mailing list search engine were
> > > not so broken, I would gladly have combed through the archives. As it
> > > stands I was unable to successfully search even for common words like
> > > "USB" or "external", both of which returned no results. I even tried
> > > the "advanced search" but to no avail.
> > >
> > > > Because of the heavy use of hardlinks,
> > > > breaking the pool up into smaller batches is not really feasible, and
> > > > indeed rsync doesn't really handle very large numbers of files and
> > > > hardlinks (because it needs to load the full trees in memory). I
> > > > believe the most common solution is not to back up the backup server,
> > > > but to use RAID (and rotate disks offsite) or use a second backup
> > > > server (so each host is backed up separately by each backup server).
> > >
> > > Thanks, this is exactly what I needed to know. I'm working out a
> > > system to simply tar up all the files once a week, as a last-resort
> > > recovery option.
> > >
> > > Regards,
> > >
> > >
> > > Sean
> > >
> > >
-------------------------------------------------------------------------
This SF.net email is sponsored by: Microsoft
Defy all challenges. Microsoft(R) Visual Studio 2005.
http://clk.atdmt.com/MRT/go/vse0120000070mrt/direct/01/
_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/