Hi,

I have a large directory (2.8 Gig)...

bash-2.05b$ du -sh Homebase-BillJ-20050429/
2.8G    Homebase-BillJ-20050429/

...I want to save this onto CD-ROM.

CD-ROMs only hold about 650 to 700 MB, right?

So, I need to do it across multiple CDs.  I am using a Mac (OS 10.3.9) but
might go back to Linux.

Anyway, any wisdom on the best way to do this?  My thinking is that
Way2 is better than Way1.  Do you agree or disagree?

Here are the ways...

WAY 1) split it manually.  This is what someone recommended.  "Do the
size calculations and copy over the stuff that will fit onto one CD."
I suppose that means finding a set of directories that would fit on
one CD and copy them over.  This seems like a real pain.  And also,
how would I preserve the whole directory structure?
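For what it's worth, the size calculations for Way1 could at least be
automated a little.  Here's a sketch (the demo directory and file names
are made up, not my real data):

```shell
# Build a tiny throwaway tree so the command has something to measure
# (all names here are made up for the demo).
mkdir -p /tmp/sizedemo/photos /tmp/sizedemo/mail
dd if=/dev/zero of=/tmp/sizedemo/photos/blob bs=1024 count=30 2>/dev/null
dd if=/dev/zero of=/tmp/sizedemo/mail/blob bs=1024 count=10 2>/dev/null

# Per-subdirectory sizes in 1K blocks, smallest first.  From a list like
# this you could group directories into ~650 MB batches by hand.
du -sk /tmp/sizedemo/*/ | sort -n
```

That still leaves the batching (and the directory-structure problem)
to do by hand, which is why it seems like a pain.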

WAY 2) tar the directory with something like...

tar czvf bigtarball.tar.gz Homebase-BillJ-20050429

...then use the command split.  Like so....

split -b 550m bigtarball.tar.gz bigtarballpart

...I say -b 550m so that split cuts it into 550-megabyte pieces (the
names of which begin "bigtarballpart"), which leaves some headroom on
a 650 MB disc.
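One safeguard I'm thinking of adding on top of that (my own idea, not
part of the original plan): record a checksum for every piece right
after splitting, so years later I can tell whether a CD has degraded
before I even try the extract.  Demo below with tiny made-up pieces
standing in for the real bigtarballpart* files:

```shell
# Tiny made-up pieces stand in for the real bigtarballpart* files.
mkdir -p /tmp/ckdemo
cd /tmp/ckdemo
printf 'piece one' > bigtarballpartaa
printf 'piece two' > bigtarballpartab

# cksum is in POSIX, so it exists on both the Mac and Linux.  Record a
# CRC and byte count for every piece before burning them.
cksum bigtarballpart* > bigtarball.cksums

# Years later: re-run and compare.  No diff output means no bit-rot.
cksum bigtarballpart* | diff - bigtarball.cksums
```

The bigtarball.cksums file is small, so it can go on every disc.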

Then, possibly years later, when I want to extract all this stuff on
some other machine, I copy the CD-ROM files into one directory and
concatenate the pieces (the shell expands the glob in sorted order,
so they go back together in the right sequence)...

cat bigtarballpart* > resurrectedstuff

...and then...

tar xzvf resurrectedstuff

...Does Way2 seem correct?  Is it a good reliable way of doing it?
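To convince myself, I tried a small rehearsal of Way2 end-to-end on a
tiny dummy directory (all names made up, and 1 KB pieces instead of
550 MB ones so the demo stays small):

```shell
# Rehearse Way2 end-to-end on a tiny dummy directory (names made up).
mkdir -p /tmp/way2demo/src/sub
echo "hello" > /tmp/way2demo/src/sub/file.txt
cd /tmp/way2demo

tar czf big.tar.gz src
split -b 1k big.tar.gz bigtarballpart   # 1 KB pieces, just for the demo
rm big.tar.gz                           # pretend only the pieces survive

cat bigtarballpart* > resurrectedstuff  # glue the pieces back together
mkdir out
cd out
tar xzf ../resurrectedstuff             # unpack the reassembled tarball
cat src/sub/file.txt                    # prints "hello"
```

Also, if disk space is tight on the restore machine, it looks like the
intermediate file can be skipped entirely (at least with GNU tar) by
streaming the pieces straight in: cat bigtarballpart* | tar xzvf -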

Would you do it a better way?

Thanks,

Bill

_______________________________________________
Siglinux mailing list
[email protected]
http://www.utacm.org:81/mailman/listinfo/siglinux
