On Friday, 25 May 2018 at 22:07:22 UTC, Dukc wrote:
> On Friday, 25 May 2018 at 21:06:17 UTC, Walter Bright wrote:
>> This ambiguity bug with + has been causing well-known problems since Algol. A *really* long time. Yet it gets constantly welded into new languages.
>
> Yeah. I could understand that choice for a language that tries to be simple for beginners above everything else. But for a large-scale application language like C#, I guess this just did not occur to them.

I used to program in C# quite regularly and never had this issue. It is not a problem of the language but a problem of the programmer.

A programmer should always know the types he is working with and the semantics of the operations he applies to them. While the overloaded + obviously has the potential to cause problems, it is not a huge deal in general. I might have been caught by that "bug" once or twice, but it's usually an obvious fix. If you are moving from one language to another, or haven't programmed in one much, you will have these kinds of problems, but they go away with experience. Faulting the language for that is wrong.

Languages should not be designed around noobs, because then the programmers of that language stay noobs. Think BASIC. If all you ever did was program in BASIC, you would be considered a novice programmer by today's standards. Even if you were an expert BASIC programmer, you would be confused when you moved to a modern language. To say that modern languages are inferior because they don't do things the BASIC way would be wrong; the problem is your unfamiliarity with the language and with newer programming concepts.
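To make the ambiguity concrete: in C# (as in Java), + means both numeric addition and string concatenation, so "Total: " + 1 + 2 quietly evaluates left to right and produces "Total: 12" instead of "Total: 3". D reserves ~ for concatenation, which turns the same slip into a compile error. A minimal sketch of the D side:

    import std.conv : to;
    import std.stdio : writeln;

    void main()
    {
        int a = 1, b = 2;

        // string s = "Total: " + a + b; // does not compile in D;
        //                               // + is arithmetic only

        // Concatenation is spelled ~, and the int must be converted explicitly:
        string s = "Total: " ~ (a + b).to!string;
        writeln(s); // prints: Total: 3
    }

Whether that extra strictness is worth it is exactly what's in dispute here, but it does move the mistake from runtime to compile time.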

A language will never solve all your problems as a programmer; if it could, it would write the programs for us.

