On Wed, 22 Oct 2003 05:51:00 -0400, HaywireMac
<[EMAIL PROTECTED]> wrote:
> The directory is about 630MB, and I left it going for 4.5 hours
> creating a tar.bz2, still wasn't done when I got up this AM.
It's been mentioned before, but is there a link somewhere to a higher
directory, making a loop?
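(A quick way to check for that kind of loop, assuming the directory being
archived is called backup_dir/, is to list any symlinks under it first:

$ find backup_dir/ -type l -ls

Note that plain tar stores symlinks as links rather than following them,
so a loop only really bites when archiving with -h/--dereference.)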
On Wed, 22 Oct 2003 08:40:53 -0700
Eric Huff <[EMAIL PROTECTED]> uttered:
> YMMV, but I have heard people profess loudly not to use compression
> on backups. The tar isn't as small, but there's way less chance of
> corruption...
I tested 'em all afore I burned 'em, but thanks for the concern!
--
HaywireMac
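(For what it's worth, both compressors can test an archive without
extracting it; the file names below are just examples:

$ bzip2 -tv backup.tar.bz2
$ gzip -tv backup.tar.gz
$ tar tjf backup.tar.bz2 > /dev/null

The first two verify the compressed stream itself, while the tar listing
also walks the archive structure.)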
> I want to back up a directory to an archive for my upcoming big wipe
> for 9.2.
>
> The directory is about 630MB, and I left it going for 4.5 hours
> creating a tar.bz2, still wasn't done when I got up this AM.
>
> Is it faster (but less compression) just to do tar.gz?
YMMV, but I have heard people profess loudly not to use compression on
backups. The tar isn't as small, but there's way less chance of
corruption...
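(To answer the speed question empirically, timing both runs on the same
directory shows the difference directly; backup_dir/ is a placeholder:

$ time tar czf test.tar.gz backup_dir/
$ time tar cjf test.tar.bz2 backup_dir/

gzip is generally much faster than bzip2, at the cost of a somewhat
larger archive.)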
On Wed, 22 Oct 2003 11:59:57 +0200
Raffaele Belardi <[EMAIL PROTECTED]> uttered:
> $ tar cvzf archive_file.tgz archive_directory/
>
> will tar and compress the contents of archive_directory/ into a file
> named archive_file.tgz.
> v is not mandatory, but gives you the idea of the progress it's
> making.
You have to tell it what to tar and where to tar it to, e.g.:
$ tar -czf test.tar.gz /home/test/
Tony.
-Original Message-
From: HaywireMac [mailto:[EMAIL PROTECTED]
Sent: Wednesday, October 22, 2003 10:51 AM
To: Mandrake Newbs
Subject: [newbie] How long to create a large tar.gz
I want to back up a directory to an archive for my upcoming big wipe
for 9.2.
$ tar cvzf archive_file.tgz archive_directory/
will tar and compress the contents of archive_directory/ into a file
named archive_file.tgz.
v is not mandatory, but gives you the idea of the progress it's making.
On my PIII/733 it takes less than 15 minutes to tar and gzip more than
600 MByte of data.
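(If gzip itself is still too slow, the compression level can be traded
for speed by piping tar through gzip explicitly; the -1 level and file
names here are only an example:

$ tar cf - archive_directory/ | gzip -1 > archive_file.tgz

-1 is the fastest/least-compressed setting, -9 the slowest/smallest;
the default is -6.)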