Hi 

I am trying to tar and gzip some very large directories, using tar
version 1.12-4. 
Each time, I get the message:

"stdout File too large"

on the command line when the job finishes (or exits too soon?).

I thought at first that the job hadn't completed properly, that is,
that the tar job exited before finishing because what I was trying to
tar up was simply too large. So I split the directories to be tarred
into two jobs (directories A to L in one tar job, and directories M to
Z in the other), so that each tar archive only had to cover half of
what was done before. I still got the message "stdout File too large"
for both tar archives.
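For concreteness, each job is something like the following (the archive
names here are made up, and the globs just stand for the A to L / M to
Z split):

  # first half: directories A to L, tar writing to stdout, piped to gzip
  tar cf - [A-L]* | gzip > archive_A-L.tar.gz

  # second half: directories M to Z
  tar cf - [M-Z]* | gzip > archive_M-Z.tar.gz

I assume the "stdout" in the message refers to the gzip end of the
pipeline, since that is what actually writes the .tar.gz file.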

I am starting to wonder whether I need to worry about this message at
all. Both tar archives are rather large, so perhaps everything was
tarred up after all. However, how can I be sure? This is a bit hard to
verify.
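The only check I can think of is to list each archive back without
extracting it and see whether the listing looks complete (using the
same made-up names as above); a truncated archive should at least make
tar or gzip complain at the end:

  gzip -dc archive_A-L.tar.gz | tar tf - > listing_A-L.txt
  gzip -dc archive_M-Z.tar.gz | tar tf - > listing_M-Z.txt

But with archives this size, even that takes a long time.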

Does anyone have experience with this message? Does anyone know
whether the tar archives are OK or not?

Any information will be most appreciated. 

Thanks

Hugo 

-- 
Dr Hugo Bouckaert - Systems Administrator, Computer Science UWA
Tel: +(61 8) 9380 2878 / Fax: +(61 8) 9380 1089
Email: [EMAIL PROTECTED] / Web: http://www.cs.uwa.edu.au/~hugo
