On 3.11.2025 15.45, Roberto E. Vargas Caballero wrote:
>> This is actually a simple case of passing a wrongly sized buffer to
>> sha512_sum_n() when finalizing, and it happens with any input. The problem lies
> Ok, that makes sense. I suppose the buffer passed depends on the 'n' used
> in the specific hash.
Yes, in the case of SHA-512/224, only the first 224 bits of the produced
data are used. But as 'n' is in units of 64-bit items, 224 bits cannot be
expressed as a whole number of those items.
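To put numbers on it: if the output is written straight into the 28-byte md,
no whole 'n' works (illustration only, not the literal current code):

        sha512_sum_n(ctx, md, 3);  /* writes 3 * 8 = 24 bytes, 4 bytes short */
        sha512_sum_n(ctx, md, 4);  /* writes 4 * 8 = 32 bytes, overflows md  */
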
>> with sha512_sum_n() requiring the digest length argument to be in multiples
>> of uint64_t, but for 224 bits this would be 3.5. Either the argument needs
>> to be changed to be more granular, or the sha512_224_sum() in
>> libutil/sha512-224.c needs to use a temporary buffer.
> uhmmmm, using an internal buffer can produce some problems in threaded programs.
> Maybe we can add a buffer to the ctx parameter. Do you think the problem can
> be solved that way?
The buffer would only be 32 bytes, so it should pose no problem to just have
the temp array as a local variable in the sha512_224_sum() function,
something like:

void
sha512_224_sum(void *ctx, uint8_t md[SHA512_224_DIGEST_LENGTH])
{
        uint8_t buf[32];

        /* n = 4 makes sha512_sum_n() write 4 * 8 = 32 bytes into buf */
        sha512_sum_n(ctx, buf, 4);
        /* only the first 28 bytes form the SHA-512/224 digest */
        memcpy(md, buf, SHA512_224_DIGEST_LENGTH);
}
Otherwise sha512_sum_n() needs to be adjusted to take the digest size in
some other unit.
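For completeness, that alternative could look something like the sketch below:
serialize the full state into a local buffer and copy out only the requested
number of bytes. The context layout (a uint64_t h[8] state array) and the
omitted padding step are assumptions on my part, not the actual libutil code:

/* hypothetical byte-granular finalizer, sketch only */
void
sha512_sum_bytes(struct sha512 *s, uint8_t *md, size_t len)
{
        uint8_t full[64];
        size_t i;

        /* ... final padding and length block, as sha512_sum_n() does ... */

        for (i = 0; i < 8; i++) {
                /* store each 64-bit state word big-endian */
                full[8 * i + 0] = s->h[i] >> 56;
                full[8 * i + 1] = s->h[i] >> 48;
                full[8 * i + 2] = s->h[i] >> 40;
                full[8 * i + 3] = s->h[i] >> 32;
                full[8 * i + 4] = s->h[i] >> 24;
                full[8 * i + 5] = s->h[i] >> 16;
                full[8 * i + 6] = s->h[i] >> 8;
                full[8 * i + 7] = s->h[i];
        }
        /* callers pass the digest length in bytes, e.g. 28 or 64 */
        memcpy(md, full, len);
}
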
>> The SHA-256 based sha224sum actually has the same problem, but there the fix
>> is simpler, just changing the argument of sha256_sum_n to 7, as SHA-256
>> internally uses 32-bit ints.
> Sorry, I didn't understand this point. I don't see the relation between 7 and
> the internal 32-bit ints. Can you elaborate a bit more? (Take into account
> that I didn't write that code.)
Sorry, it was a bit unclear. sha256_sum_n expects the size argument to
be in units of the internal state array, which consists of 32-bit ints. Now as
224 is an exact multiple of 32, the overflow can simply be fixed by
passing the correct value, 224/32 = 7, instead of the current 8.
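In code, roughly (a sketch; I am guessing the sha224 function and macro names
by analogy with the sha512-224 one):

void
sha224_sum(void *ctx, uint8_t md[SHA224_DIGEST_LENGTH])
{
        /*
         * SHA224_DIGEST_LENGTH is 28 bytes and sha256_sum_n() counts in
         * 32-bit words, so n = 7 writes exactly 7 * 4 = 28 bytes; the
         * current n = 8 writes 32 bytes and runs 4 bytes past md.
         */
        sha256_sum_n(ctx, md, 7);
}
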
--
Santtu