Jens Alfke wrote:

> Which is to say that, if you really want to engage in productive debate or provide alternatives, you should spend some time learning the theory behind languages and also looking at non-C-like languages, especially functional ones.

This. C doesn't even have a type system: just a handful of compiler directives for allocating memory on the stack. If your only experience is in C and its descendants, you're simply not qualified to discuss type system design. Go learn a language like Haskell or ML that has a real type system; it'll open your eyes. Type systems are for expressing your exact requirements in terms of set theory, not for checking the homework of the terminally sloppy and lazy.


> Optionals come out of a long line of thinking in functional programming languages. The broader idea is that if a value can have multiple mutually exclusive states (in this case “has a value” vs “has no value”) then those states should be tagged and should require explicit action to select between. That’s basically what Swift enums are (and the same concept is found in a lot of other languages like Haskell, Erlang, Scala, Rust…)

Indeed. The common term in FP is "sum types". (Also known as "variant types" or "tagged unions", though I dislike that last term as it emphasizes implementation rather than purpose - another easy engineer trap.) Here's a quick read:

 
https://www.fpcomplete.com/school/to-infinity-and-beyond/pick-of-the-week/sum-types

Basically, just think of `FooType?` as a syntactic shorthand for writing `FooType | NoneType`, i.e. the sum of FooType and NoneType. [1]
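And in Swift this isn't just an analogy: the standard library declares `Optional` as an ordinary generic enum with two cases. Here's a minimal re-creation (I'm calling it `MyOptional` purely to avoid clashing with the real one):

```swift
// A minimal re-creation of Swift's Optional as a plain sum type.
// The real Optional<Wrapped> in the standard library is declared
// essentially the same way.
enum MyOptional<Wrapped> {
    case some(Wrapped)  // "has a value" - tags the wrapped value
    case none           // "has no value" - a real case, not a raw pointer
}

// Selecting between the two states requires an explicit pattern match.
func describe<T>(_ value: MyOptional<T>) -> String {
    switch value {
    case .some(let wrapped): return "some(\(wrapped))"
    case .none:              return "none"
    }
}

let present: MyOptional<Int> = .some(42)
let absent: MyOptional<Int> = .none
print(describe(present))  // some(42)
print(describe(absent))   // none
```

Note that the compiler forces you to handle both cases in the `switch`; that exhaustiveness check is the whole point of tagging the states.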

Similarly, `foo = foo_or_none!` (a forced unwrap) is just shorthand for concisely expressing a common use-case where your code can't reasonably continue unless the value given is an instance of FooType:

    case foo_or_none of
        FooType (foo) -> [process the foo value as normal]
        NoneType -> [throw a standard exception]

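In real Swift, that desugaring looks roughly like the following (`loadFoo` is a made-up function standing in for whatever produced your optional):

```swift
// A hypothetical source of an optional value, for illustration only.
func loadFoo() -> String? { return "foo" }

let maybeFoo = loadFoo()

// What `let foo = maybeFoo!` does, spelled out as a pattern match:
let foo: String
switch maybeFoo {
case .some(let value):
    foo = value                        // proceed as normal
case .none:
    fatalError("unexpectedly found nil")  // the standard trap, like `!`
}
print(foo)
```

The `!` operator is just this match with the failure arm filled in for you; `if let` / `guard let` are the same match with the failure arm left up to you.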
Frankly, if you want to grouse about something, grouse about Swift's love of syntactic special forms, which makes the language look even more semantically complex and disjointed than it actually is. Having cut my coder's teeth on AppleScript, I know this special hell only too well already.


> There’s a school of thought that null pointers are harmful; optionals are a reaction to that. I just looked up the source — Tony Hoare gave a presentation where he formally apologized for inventing null pointers in 1965 as part of ALGOL W: [...] It’s a great quote, but I don’t think that was the first appearance of null. LISP dates back to the late ‘50s and has always had nil references (right?)

Lisp has a `nil` object. That's not the same thing as a nil pointer. The first is an actual Thing; the second is a promise to give you a thing that instead drops you down a hole when you actually ask for it.
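Swift's `nil` follows the Lisp tradition here, for what it's worth: it's sugar for `Optional.none`, a first-class value you can store, compare, and pattern-match, not a raw address waiting to trap. A quick sketch:

```swift
// "No value" is itself a value: you can put it in a collection
// and handle it explicitly, rather than crashing on dereference.
let nothing: Int? = nil          // sugar for Optional<Int>.none
let somethings: [Int?] = [1, nothing, 3]

// Recover a plain [Int] by supplying a default for the .none case.
let recovered = somethings.map { $0 ?? 0 }
print(recovered)  // [1, 0, 3]
```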


HTH

has

[1] Pseudocode, obviously. I *really* wish Swift's designers had copied FP's elegant type declaration and pattern matching syntax, instead of the godawful C++ hideousness they went with. It's so much cleaner it isn't funny.
_______________________________________________

Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)
