Greetings!

I am using cdbkup to (what else?) back up my system to CD. It is quite a 
reliable program, so I was very surprised to find it has a nasty bug.

In short, whenever I tried to perform a backup that didn't fit on one CD, the 
program would create ISO images that were consistently too large for my CDs.

I dug around and found the problem, but have no clue as to the cause. In the 
section that gathers the data to be backed up, the program invokes gtar as 
follows:

open( INPUT, "(cd / && /bin/gtar -f - -cSlz " .
             "--listed-incremental=/tmp/cdbkup-root --label=root-2003-11-30-0 " .
             "--exclude=./home/backup --exclude=./root/root-2003-11-30-0.tar.gz " .
             "--exclude=./tmp/cdbkup-root .)|");

Later, the application reads from the pipe as follows:

        $imgsize = 0;
        open( TMPIMG, ">$tmpimg") || die "Can't open '$tmpimg': $!\n";
        while( $rc = read( INPUT, $data, $increment)) {
                $imgsize += $rc;
                print TMPIMG $data || die "Can't write data to tmp file: $!\n";
                last if( netsize($imgsize + $increment) > $maxsize);
        }

Now, the $imgsize variable is consistently smaller than the real file size as 
reported by ls.
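
For what it's worth, here is a minimal standalone sketch of the same loop 
shape, which I would use to check whether the byte counting itself can drift 
from the on-disk size. The producer command, temp path, and byte total are 
made up for this test; only the read/write loop mirrors the code above:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical test path and chunk size, not the real cdbkup values.
my $tmpimg    = "/tmp/cdbkup-sizecheck.$$";
my $increment = 4096;

# Produce exactly 1_000_000 known bytes from a child process, the way
# gtar would feed the pipe.
open( my $input, '-|', 'head -c 1000000 /dev/zero' )
    or die "Can't start producer: $!\n";

open( my $out, '>', $tmpimg ) or die "Can't open '$tmpimg': $!\n";
binmode($out);    # harmless on Unix, required elsewhere

my $imgsize = 0;
while ( my $rc = read( $input, my $data, $increment ) ) {
    $imgsize += $rc;

    # Side note: written as 'print TMPIMG $data || die ...', the || binds
    # to $data, so a failed print never triggers the die. Using 'or'
    # instead makes write errors visible; it would not by itself change
    # $imgsize, but it can hide a short write.
    print {$out} $data or die "Can't write data to tmp file: $!\n";
}
close($out)   or die "Can't close '$tmpimg': $!\n";
close($input);

my $onDisk = -s $tmpimg;
print "counted=$imgsize on-disk=$onDisk\n";
unlink $tmpimg;
```

On my understanding of read(), the running total and the -s size should 
always agree once the file is closed, so any discrepancy here would point 
at the loop rather than at gtar.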

I checked as many FAQs as I could, and found a reference to 'use bytes' vs. 
'no bytes'. I tried both, and it made no difference.

I am sure this is something trivial, but I really need help.

Thanks
Marco


