On 10/8/2020 8:09 AM, Michael Stone wrote:
> On Thu, Oct 08, 2020 at 11:53:16AM +0200, Thomas Schmitt wrote:
>> Michael Stone wrote:
>>> I'd assume it's confusion between bits and bytes. [...]
>>> just write out bit or byte
>> Andrei POPESCU wrote:
>>> SI prefixes can also help... if you use them consistently.
>> It is a classic that programs talk mixed about GB and GiB while not
>> clearly distinguishing them. In general, users must keep the
>> difference in mind when they compare "GB" values from different
>> programs. See
>> https://en.wikipedia.org/wiki/Mebibyte
> This is basically never an issue in conversational usage as the
> difference is less than the margin of error or real-world precision. If
I rather beg to differ. Many daily operations involve precision better
than the 2.4% variance between 1000 and 1024. Certainly my bank account
balance is maintained to within $0.01 on a balance of more than $10,000,
which is better than 0.0001%. I would be fairly distressed if $240 (2.4%
of $10,000) were missing from my account. Much of the work I do involves
tolerances of less than 1%.
> you're planning for a million dollars worth of storage, yeah, make sure
> you're clear on what you're buying. But when discussing a 10Gbit/s
> network or a 4TByte drive, there isn't ambiguity. (Only, potentially,
> pedantry.)
Well, what, really, is wrong with pedantry? To an engineer, precision
is absolutely of the essence. What's more, the margins become far
tighter in certain circumstances, especially as a container nears
being full. If one has exactly 1 MB of storage available (allowing for
file system overhead and block alignment), then 1 MB of data will fit,
but 1 MiB will not.
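To put numbers on that, here's a quick Python sketch (the 1 MB of free
space is the hypothetical scenario above, not a real device):

```python
# Hypothetical: exactly 1 MB (10^6 bytes) of usable space remains.
free_space = 10**6        # 1 MB, SI decimal definition
mb_file = 10**6           # a 1 MB file
mib_file = 2**20          # a 1 MiB file: 1,048,576 bytes

print(mb_file <= free_space)    # True: the 1 MB file fits exactly
print(mib_file <= free_space)   # False: the 1 MiB file does not fit
print(mib_file - free_space)    # 48576 bytes short
```

So a tool that reports "1 MB free" when it means 10^6 bytes will refuse
a "1 MB" file that was measured in binary units, by almost 50 KB.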
Not only that, but the discrepancy compounds with each step up in order
of magnitude. The difference between 1 KB and 1 KiB is only 24 bytes,
or 2.4%, but the difference between 1 TB and 1 TiB is 9.9%, which is
getting to be pretty significant. And in absolute terms that gap is
about 93 GiB, which is a pretty good chunk of storage in its own right.
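The compounding is easy to tabulate (a sketch, assuming the standard SI
decimal and IEC binary prefix definitions):

```python
# Compare decimal (SI) and binary (IEC) prefixes at each magnitude.
prefixes = ["K", "M", "G", "T", "P"]

for i, p in enumerate(prefixes, start=1):
    si = 1000 ** i   # e.g. 1 KB = 10^3 bytes
    iec = 1024 ** i  # e.g. 1 KiB = 2^10 bytes
    pct = (iec - si) / si * 100
    print(f"1 {p}iB exceeds 1 {p}B by {iec - si:>16,} bytes ({pct:5.2f}%)")
```

Each prefix step multiplies the ratio by another factor of 1.024, so
the percentage gap climbs from 2.4% at kilo to roughly 10% at tera and
about 12.6% at peta.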