https://gcc.gnu.org/bugzilla/show_bug.cgi?id=107815

--- Comment #19 from dave.anglin at bell dot net ---
On 2022-11-28 4:39 a.m., jakub at gcc dot gnu.org wrote:
> https://gcc.gnu.org/bugzilla/show_bug.cgi?id=107815
>
> --- Comment #18 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
> Or better yet
> #include <stdlib.h>
> #include <stdio.h>
>
> int
> main ()
> {
>    char *end;
>    const char *p = "6e-4966";
>    long double l = strtold (p, &end);
>    if (l != __LDBL_DENORM_MIN__ || end != p + 7)
>      printf ("%Le %s\n", l, end);
>    p = "1e-4965";
>    l = strtold (p, &end);
>    if (l != 2.0L * __LDBL_DENORM_MIN__ || end != p + 7)
>      printf ("%Le %s\n", l, end);
>    p = "2e-4965";
>    l = strtold (p, &end);
>    if (l != 3.0L * __LDBL_DENORM_MIN__ || end != p + 7)
>      printf ("%Le %s\n", l, end);
>    return 0;
> }
> so that we know if it is just the denorm_min() case or also other denormals.
I tried both test cases with a recent build of gcc-12. Neither failed at -O0
or -O2, and nothing was printed.
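
For background on the expected values: the test assumes IEEE binary128 long
double, whose smallest subnormal (__LDBL_DENORM_MIN__) is 2^-16494, about
6.48e-4966, so 6e-4966 should round to exactly that value, while 1e-4965
(about 1.54 ulps) and 2e-4965 (about 3.09 ulps) should round to two and three
times it. A variant that also dumps the raw bytes of each parsed value can
make results easier to compare across targets; the sketch below is not from
the report, assumes only standard C, and prints whatever sizeof (long double)
happens to be on the target, padding bits included.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Parse P with strtold and print the value, the number of characters
   consumed, and the raw byte representation of the result.  */
static void
dump (const char *p)
{
  char *end;
  long double l = strtold (p, &end);
  unsigned char bytes[sizeof (long double)];
  memcpy (bytes, &l, sizeof l);
  printf ("%s -> %Le, consumed %d chars, bytes:", p, l, (int) (end - p));
  for (size_t i = 0; i < sizeof bytes; i++)
    printf (" %02x", (unsigned) bytes[i]);
  putchar ('\n');
}

int
main ()
{
  dump ("6e-4966");
  dump ("1e-4965");
  dump ("2e-4965");
  return 0;
}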
