On Wed 30 Oct 2019 05:23:30 PM CET, Max Reitz wrote:
>> +Subcluster Allocation Bitmap (for standard clusters):
>> +
>> +    Bit 0 - 31:  Allocation status (one bit per subcluster)
>> +
>> +                 1: the subcluster is allocated. In this case the
>> +                    host cluster offset field must contain a valid
>> +                    offset.
>> +                 0: the subcluster is not allocated. In this case
>> +                    read requests shall go to the backing file or
>> +                    return zeros if there is no backing file data.
>> +
>> +                 Bits are assigned starting from the most significant one.
>> +                 (i.e. bit x is used for subcluster 31 - x)
>
> I seem to remember that someone proposed this bit ordering to you, but
> I wonder why. So far everything in qcow2 starts from the least
> significant bit, for example refcounts (“If refcount_bits implies a
> sub-byte width, note that bit 0 means the least significant bit in
> this context”), feature bits, and sub-byte structure descriptions in
> general (which you reference directly with “bit x”).
>
> Soo... What’s the reason for doing it the other way around here?
The reason is that I thought it would be better for debugging purposes.
If I do a hexdump of the L2 table to see what's going on, then starting
from the most significant bit gives me a better visual image of which
subclusters are allocated.

In other words, if the first two subclusters are allocated, I think this
representation

   11000000 00000000 00000000 00000000  (c0 00 00 00)

is more natural than this one

   00000000 00000000 00000000 00000011  (00 00 00 03)

But I don't have a very strong opinion so I'm open to changing it.

Berto
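To make the two orderings concrete, here is a small Python sketch (not part of the patch; the helper names are made up for illustration) that builds the 32-bit allocation bitmap both ways and prints what a big-endian hexdump of it would show:

```python
def bitmap_msb_first(allocated):
    """Proposed ordering: bit x covers subcluster 31 - x,
    so subcluster s sets bit 31 - s."""
    value = 0
    for s in allocated:
        value |= 1 << (31 - s)
    return value


def bitmap_lsb_first(allocated):
    """qcow2's usual convention: bit x covers subcluster x."""
    value = 0
    for s in allocated:
        value |= 1 << s
    return value


# First two subclusters allocated, as in the example above:
msb = bitmap_msb_first([0, 1])
lsb = bitmap_lsb_first([0, 1])

print(f"{msb:032b}", msb.to_bytes(4, "big").hex(" "))
# 11000000 ... -> hexdump reads "c0 00 00 00"
print(f"{lsb:032b}", lsb.to_bytes(4, "big").hex(" "))
# ... 00000011 -> hexdump reads "00 00 00 03"
```

With MSB-first ordering the hexdump bytes line up left-to-right with the subcluster indices, which is the "better visual image" argument; with LSB-first ordering the bitmap matches how every other qcow2 bit field is documented.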