Public bug reported:

Hello
In my case, I extract large archives (~1.5 GB compressed with 7zip, ~5 GB
uncompressed). The extracted copy != original; I compare the files with
vbindiff.
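The same whole-file comparison can be confirmed without vbindiff by hashing both files; a minimal Python sketch (the file names on the command line are placeholders, not taken from this report):

import hashlib
import sys

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MiB chunks so multi-GB inputs fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Usage: python check.py original.bin extracted.bin
if sha256_of(sys.argv[1]) == sha256_of(sys.argv[2]):
    print("files match")
else:
    print("MISMATCH: extracted copy differs from original")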

I do not know whether this information is useful, because on each
extraction the data is lost in different areas.

Example (in extracted copy):
199B CD5D - 199B CFFF  filled with 0x00
2E79 A438 - 2E79 AFFF  filled with 0x00
36B9 93DB - 36B9 9FFF  filled with 0x00
71EF 1E45 - 71EF 1FFF  filled with 0x00
7AFF 0C5A - 7AFF 0FFF  filled with 0x00
95CE D752 - 95CF DFFF  filled with 0x00
... etc ...

As can be seen, the missing data is always replaced with zeros. The block
sizes vary, but every range ends on a 0x...FFF boundary, i.e., at the last
byte of a 4096-byte-aligned block.
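To make this pattern easy to check mechanically, here is a minimal Python sketch that walks both files, reports every run where the copy reads 0x00 while the original does not, and flags whether each run ends on a 0x...FFF (4 KiB) boundary. The file names and chunk size are placeholders of my choosing, and it assumes both files have the same length, as in this report:

import sys

CHUNK = 1 << 20  # compare in 1 MiB pieces so ~5 GB files fit in memory

def corrupt_runs(orig_path, copy_path):
    """Yield (start, end) offsets of maximal byte runs where the
    extracted copy holds 0x00 but the original holds something else."""
    start, pos = None, 0
    with open(orig_path, "rb") as fo, open(copy_path, "rb") as fc:
        while True:
            a, b = fo.read(CHUNK), fc.read(CHUNK)
            if not a:
                break
            for i, (oa, ob) in enumerate(zip(a, b)):
                bad = ob == 0 and oa != 0
                if bad and start is None:
                    start = pos + i           # run begins here
                elif not bad and start is not None:
                    yield start, pos + i - 1  # run ended on previous byte
                    start = None
            pos += len(a)
    if start is not None:
        yield start, pos - 1  # run reaches end of file

# Usage: python find_zero_runs.py original.bin extracted.bin
for s, e in corrupt_runs(sys.argv[1], sys.argv[2]):
    on_boundary = (e & 0xFFF) == 0xFFF  # last byte of a 4 KiB block?
    print(f"{s:09X} - {e:09X}  zero-filled, "
          f"ends {'on' if on_boundary else 'NOT on'} 0x...FFF boundary")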

Ubuntu 11.04 Natty, x86, kernel 2.6.37-9-generic-pae

** Affects: ubuntu
     Importance: Undecided
         Status: New

https://bugs.launchpad.net/bugs/691454

Title:
  Data is lost when writing large amounts of data to disk
