https://gcc.gnu.org/bugzilla/show_bug.cgi?id=92639

Jerry DeLisle <jvdelisle at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |jvdelisle at gcc dot gnu.org

--- Comment #3 from Jerry DeLisle <jvdelisle at gcc dot gnu.org> ---
Please observe:

program demonstrate
  integer :: i
  i = huge(i)
  print *, i

  i = -huge(i)
  print *, i
end program

$ gfc demonstrate.f90 
$ ./a.out 
  2147483647
 -2147483647

This is per the Fortran standard's definition of huge() for a 32-bit integer.
You cannot represent 2147483648 in a 32-bit signed integer because one bit is
needed for the sign, leaving 31 bits for the magnitude. How other compilers
store the sign I cannot speak to.
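
To make the bit arithmetic explicit, here is a minimal sketch (mine, not part of
the original report; the program name and the use of iso_fortran_env are my own
choices) that prints bit_size(i), huge(i), and 2**31 - 1 computed in 64-bit
arithmetic; the last two are equal:

program sign_bit_demo
  use iso_fortran_env, only: int64
  implicit none
  integer :: i
  ! One of the 32 bits holds the sign, leaving 31 bits for the magnitude,
  ! so the largest representable value is 2**31 - 1 = 2147483647.
  print *, 'bit_size(i) =', bit_size(i)
  print *, 'huge(i)     =', huge(i)
  print *, '2**31 - 1   =', 2_int64**31 - 1_int64
end program sign_bit_demo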
