Walter Bright wrote:
"Nobody notices they are immutable, they just work."

So what is it about immutability that makes strings "just work" in a natural and intuitive manner? The insight is that it enables strings, which are reference types, to behave exactly as if they were value types.

After all, it never occurs to anyone that the integer 123 could be a "mutable" integer and perhaps be 133 the next time you look at it. If you put 123 into a variable, it stays 123. It's immutable. People intuitively expect strings to behave the same way. Only C programmers expect that once they assign a string to a variable, that string may change in place.
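To make that concrete, here is a minimal sketch in D (the language under discussion here, as I read the thread; the variable names are mine). A D string is just an alias for immutable(char)[], a reference type, yet it behaves like a value:

    import std.stdio;

    void main()
    {
        string a = "hello"; // string == immutable(char)[], a reference type
        string b = a;       // no copy is made; b refers to the same characters
        // b[0] = 'H';      // would not compile: cannot modify immutable data
        a = "world";        // rebinds a; b is untouched
        writeln(b);         // prints "hello", exactly what value semantics predict
    }

Assigning a string copies only a pointer and a length, but since nobody can mutate the characters, the sharing is invisible.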

C has it backwards by making strings mutable, and that's one of the main reasons dealing with strings in C is such a gigantic pain. But as a longtime C programmer, I was so used to it that I didn't notice what a pain it was until I started using other languages where string manipulation was a breeze.
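For contrast, the same aliasing situation with a mutable array in D reproduces the C pain (again a sketch of mine, not from the original post, with a char[] standing in for a C char buffer):

    import std.stdio;

    void main()
    {
        char[] a = "hello".dup; // dup makes a mutable copy, like a C buffer
        char[] b = a;           // b aliases the very same buffer
        b[0] = 'H';             // mutate through one name...
        writeln(a);             // ...and a changes too: prints "Hello"
    }

That action at a distance is exactly what C programmers learn to defend against with strdup and defensive copies.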

I could fall into an infinite loop agreeing with you.

Cheers
