At 06:57 PM 9/16/2003, you wrote:
On Tue, 16 Sep 2003 18:31:32 -0700
Ian L <[EMAIL PROTECTED]> wrote:

> Hey all,
>
> I made a stupid assumption ... that tar would do something intelligent.
>
> I was running a backup such as: tar -cvlf file.tar -L 500000 /disk /disk2
>
> What I thought this would do is back up /disk and /disk2 in multiple tar
> files of 500 MB each. Instead, it creates file.tar and, when it hits
> 500 MB, waits for me to hit return and then just continues where it
> stopped, except it starts file.tar over.
>
> Can someone tell me a way that I can tar up (tar, zip, compress, I don't
> care) a group of directories and have them split up by whatever size I
> set? And have it be multiple files ... I don't want to sit there and
> manually rename files or anything. I want to let it run, come back later,
> and have X number of archive files. Ultimately these will be burned to a
> DVD drive on a Windows machine.
>
> thanks,
>


Hi Ian,

tar -cvlf - /disk /disk2 | split -b 500m - file.tar.

The "-f -" tells tar to write the archive to standard output. split will
create files named "file.tar." plus a two-letter suffix (file.tar.aa,
file.tar.ab, ...) for as many files as it takes to store the input, each
at most 500 MB.
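A minimal sketch of what the pipeline produces; the directory names and the tiny 1 KB chunk size are just for demonstration (your real command would use /disk, /disk2 and -b 500m):

```shell
#!/bin/sh
# Demo: pipe a tar archive through split and inspect the pieces.
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Create some sample data to archive (a stand-in for /disk and /disk2).
mkdir -p disk disk2
dd if=/dev/zero of=disk/big1 bs=1024 count=3 2>/dev/null
dd if=/dev/zero of=disk2/big2 bs=1024 count=3 2>/dev/null

# Write the archive to stdout with "-f -" and cut it into 1 KB pieces;
# split appends a two-letter suffix (aa, ab, ...) to "demo.tar.".
tar -cf - disk disk2 | split -b 1k - demo.tar.

ls demo.tar.*    # e.g. demo.tar.aa demo.tar.ab ...
```

Each piece except the last is exactly the requested size, so the pieces can later be concatenated back into the original archive byte-for-byte.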

To get a listing of what's in the archive, something like:

cat file.tar.* | tar -tvf -

Of course, you could add -z or -j to the tar commands (both when creating
and when listing) so that the output is compressed too.
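The same cat trick reassembles the pieces for extraction, not just for listing. A quick round-trip check, with made-up paths and a small chunk size purely for illustration:

```shell
#!/bin/sh
# Demo: split a tar stream, then reassemble the pieces and extract.
set -e
workdir=$(mktemp -d)
cd "$workdir"

mkdir data
echo "hello backup" > data/note.txt

# Archive to stdout and split into small pieces for the demo.
tar -cf - data | split -b 1k - backup.tar.

# Concatenate the pieces in order (shell globbing sorts the suffixes)
# and extract the resulting stream into a fresh directory.
mkdir restore
cat backup.tar.* | tar -xf - -C restore

cat restore/data/note.txt    # prints "hello backup"
```

Because split names the pieces aa, ab, ac, ..., the glob backup.tar.* expands them in the right order automatically.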

Cheers,
Sean


Much thanks ... especially since my idea won't work anyway, as there's a 2 GB file-size limit, as Ed pointed out. Oh well ... I'll have a backup system working eventually ... sometime this year. The last backup system I tried to set up didn't work too well, since the company that made the tape drive went out of business about two weeks after I bought it. Anyone want a brand-new OnStream drive? :)

thanks again

Ian


--
redhat-list mailing list
unsubscribe mailto:[EMAIL PROTECTED]
https://www.redhat.com/mailman/listinfo/redhat-list
