On 05.07.18 at 12:04, Steven D'Aprano wrote:
On Thu, 05 Jul 2018 09:17:20 +0200, Christian Gollwitzer wrote:

On 04.07.18 at 17:31, Steven D'Aprano wrote:
On Wed, 04 Jul 2018 13:48:26 +0100, Bart wrote:

Presumably one type hint applies for the whole scope of the variable,
not just the one assignment.

You know how in C you can write

      int x = 1;  # the type applies for just this one assignment
      x = 2.5;    # perfectly legal, right?


Not sure what point you are trying to make, but your example compiles in
C if you replace the '#' comment sign with '//'.


Oops.


But... it compiles? Seriously?

Only it doesn't do what you might think: the 2.5 is implicitly converted
(truncated) to an integer, so x will be 2 in the end. There will be a
compiler warning but no error.
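
For illustration, a minimal compilable version of the snippet above (my own
sketch, with the comments changed to '//' as suggested earlier):

#include <stdio.h>
int main() {
        int x = 1;   // the declared type applies to the whole scope of x
        x = 2.5;     // compiles; 2.5 is truncated to 2 (the compiler may warn, depending on flags)
        printf("%d\n", x);   // prints 2
        return 0;
}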

Sometimes I wonder how C programmers manage to write a bug-free "Hello
World" program. No wonder it is described as a type-unsafe language or a
weakly-typed language.

Even this compiles:

#include <stdio.h>
int main() {
        int x=1;
        x="This is serious";
        printf("%d\n", x);
        return 0;
}
Apfelkiste:Tests chris$ gcc intx.c && ./a.out
intx.c:4:3: warning: incompatible pointer to integer conversion assigning to
      'int' from 'char [16]' [-Wint-conversion]
        x="This is serious";
         ^~~~~~~~~~~~~~~~~~
1 warning generated.
15294370


Assignment to an integer in C will only fail when the source is a struct; everything else can be "converted" to an integer by the compiler. In the example above it is the string literal's address that ends up (truncated) in x, which is the number printed.
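
A small sketch of that claim (my own example, not from the mail): the float and the pointer are accepted, historically with at most a warning, while assigning a struct is a hard error:

#include <stdio.h>

struct point { int x, y; };

int main() {
        int i = 0;
        i = 2.5;             // accepted: truncated to 2
        i = "hello";         // -Wint-conversion warning (recent compilers may reject this outright)
        struct point p = {1, 2};
        // i = p;            // hard error: assigning a struct to an int; uncomment to see it fail
        printf("%d\n", i);
        return 0;
}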

I understand upcasting ints to floats, that's cool (even if a few
languages take a hard line on that too, I don't). I understand Python's
dynamic typing approach. I don't understand C requiring type
declarations, then down-casting floats to integers.

Without prototypes it would be worse: the compiler would put the bit pattern of the float onto the stack and reinterpret it as an integer. With prototypes there is a "sensible" result, namely the integer part of the float. At least it doesn't crash (2.5 in C is a double constant, which is usually 64 bits, whereas int is usually only 32 bits).
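
To illustrate the difference (my own sketch, assuming a 32-bit int): a value conversion truncates 2.5 to 2, while reinterpreting the raw bytes of the double gives something unrelated:

#include <stdio.h>
#include <string.h>

int main() {
        double d = 2.5;
        int converted = (int)d;       // value conversion: the compiler truncates to 2
        int reinterpreted;
        memcpy(&reinterpreted, &d, sizeof reinterpreted);   // copies 4 of the 8 bytes verbatim
        // which half you get depends on endianness; for 2.5 (0x4004000000000000)
        // on a little-endian machine the copied low half happens to be 0, not 2
        printf("%d %d\n", converted, reinterpreted);
        return 0;
}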

At least it shows a warning. But hell, who pays attention to C compiler
warnings? There's *so many of them*.

In programs that are maintained, warnings are typically taken seriously. Some warnings can be annoying, e.g. warnings from generated code about unused variables and the like, but if you see a flood of warnings in regular code, that is a sign that the code is of poor quality or was never tested on the platform you are trying to compile it on.


(And I bet that by default the warning is disabled, amirite?)

Some people develop with -Werror (treat warnings as errors and abort the compilation).

        Christian
--
https://mail.python.org/mailman/listinfo/python-list
