Re: Optionals? A better option!

2015-05-16 Thread Uli Kusterer
On 14 May 2015, at 18:50, William Squires  wrote:
>  Swift shows promise, but - like all computer languages - it's a trade-off 
> between generated code size and the amount of abstraction the language 
> presents to those using it, making it easier to turn ideas into code. Making 
> a (more abstract) language unnecessarily complicated with '?' and '!' doesn't 
> seem to be going in the right direction, while making string concatenation 
> less complicated with an overloaded '+' operator does seem to be going in the 
> right direction (now, if we can only make extracting substrings less 
> complicated, that'll be even better! It's currently a 
> pain-in-the-compiler-backend!)

 If you want a fun, concise argument against overloading the same operator with 
different meanings: 
https://www.destroyallsoftware.com/talks/wat
Admittedly, that’s exacerbated by using a language with transparent type 
conversion, but it’s still a bad idea.

Cheers,
-- Uli Kusterer
“The Witnesses of TeachText are everywhere...”
http://zathras.de



Re: Optionals? A better option!

2015-05-15 Thread Jens Alfke

> On May 15, 2015, at 10:12 AM, Scott Ribe  wrote:
> 
> As in the olden days of OS 9 & before, when you could freely read & write 
> through location 0, usually leading to great hilarity…

Especially since the 68k CPU interrupt vectors were stored in low memory a few 
bytes from 0, so a memcpy or struct write to NULL would cause an immediate 
total system freeze.

Anyone who was developing for the Mac in those dark days remembers the 
indispensable system extension EvenBetterBusError (son of Mr. Bus Error), which, 
every few milliseconds, would (a) write an invalid address to location 0, and 
(b) trigger an exception if that value had changed since the last write. It 
wouldn’t catch dereferencing a null pointer, but it would catch dereferencing a 
null _handle_, and those were at least as common back then.

See: 
http://www.mactech.com/articles/mactech/Vol.13/13.05/CodeMechanic/index.html 


—Jens

Re: Optionals? A better option!

2015-05-15 Thread Scott Ribe
On May 15, 2015, at 11:06 AM, Kyle Sluder  wrote:
> 
> It's not valid to dereference a null pointer, but what happens when you
> do is undefined.

As in the olden days of OS 9 & before, when you could freely read & write 
through location 0, usually leading to great hilarity…

-- 
Scott Ribe
scott_r...@elevated-dev.com
http://www.elevated-dev.com/
https://www.linkedin.com/in/scottribe/
(303) 722-0567 voice

Re: Optionals? A better option!

2015-05-15 Thread Kyle Sluder
On Fri, May 15, 2015, at 11:31 AM, Jens Alfke wrote:
> Yes, but when you evaluate (cdr nil) doesn’t the interpreter trigger an
> exception? (Sorry, it’s been decades since I used LISP.) Similarly, in
> Smalltalk-80 ‘null’ is a real object, a singleton of class
> UndefinedObject, but trying to message it is going to raise a
> message-not-handled exception. The difference from a nil pointer is
> mostly that you get a language-level rather than an OS-level exception.

Just as a point of esoterica, nothing about the C spec requires the null
pointer to map to a hardware-level trap. The only requirement is that
there exist a value called the null pointer in every pointer type, that
all null pointers compare equal to each other, and that they do not
compare equal to any non-null pointers.

It's not valid to dereference a null pointer, but what happens when you
do is undefined.

The assumption that dereferencing a null pointer would always lead to a
hardware fault led to a pretty serious vulnerability in the Linux
kernel: http://lwn.net/Articles/342330/

Here's some more elaboration:
https://software.intel.com/en-us/blogs/2015/04/20/null-pointer-dereferencing-causes-undefined-behavior

And all of this is why Swift and every other modern programming language goes
the route of Optionals.
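
For illustration, a minimal sketch of what that buys you on the Swift side
(hypothetical function, nothing from a shipping API): the compiler refuses to
let you touch an optional's value until the "no value" case has been handled,
so there is no undefined dereference left to reach.

    func describe(_ maybeName: String?) -> String {
        // Using maybeName directly as a String is a compile error, not a runtime trap.
        guard let name = maybeName else {
            return "<no name>"      // the nil case has to be handled explicitly
        }
        return "name: \(name)"      // here name is a plain, non-optional String
    }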

--Kyle Sluder


Re: Optionals? A better option!

2015-05-15 Thread Jens Alfke

> On May 15, 2015, at 4:34 AM, has  wrote:
> 
> Lisp has a `nil` object. That's not the same thing as a nil pointer. The 
> first is an actual Thing; the second is a promise to give you a thing that 
> instead drops you down a hole when you actually ask for it.

Yes, but when you evaluate (cdr nil) doesn’t the interpreter trigger an 
exception? (Sorry, it’s been decades since I used LISP.) Similarly, in 
Smalltalk-80 ‘null’ is a real object, a singleton of class UndefinedObject, but 
trying to message it is going to raise a message-not-handled exception. The 
difference from a nil pointer is mostly that you get a language-level rather 
than an OS-level exception.

> I *really* wish Swift designers had copied FP's elegant type declaration and 
> pattern matching syntax, instead of godawful C++ hideousness. It's so much 
> cleaner it isn't funny.

I said similar things when Java first came out — “it’s a whole new language! 
Why did they have to copy the awful C syntax?” — but from a pragmatic 
standpoint I was wrong. A big part of the reason Java took off was that it 
looked familiar to C and C++ programmers. A language with an awesome syntax is 
still a failure if people won’t adopt it. (Of course you can take that too far 
and end up with miserable pidgins like PHP…) For better or worse, all* of the 
languages I know of in the new Modern Systems Programming category (Go, Rust, 
Swift, …) have a surface-level similarity to C.

—Jens

* Oops, Nim doesn't. But Nim also doesn’t seem to have as much traction, which 
bolsters my point.

Re: Optionals? A better option!

2015-05-15 Thread has

Jens Alfke wrote:

> Which is to say that, if you really want to engage in productive 
> debate or provide alternatives, you should spend some time learning the 
> theory behind languages and also looking at non-C-like languages, 
> especially functional ones.


This. C doesn't even have a type system: just a handful of compiler 
directives for allocating memory on the stack. If your only experience 
is in C and its descendants, you're simply not qualified to discuss type 
system design. Go learn a language like Haskell or ML that has a real 
type system; it'll open your eyes. Type systems are for expressing your 
exact requirements in terms of set theory, not for checking the homework 
of the terminally sloppy and lazy.



> Optionals come out of a long line of thinking in functional 
> programming languages. The broader idea is that if a value can have 
> multiple mutually exclusive states (in this case “has a value” vs “has 
> no value”) then those states should be tagged and should require 
> explicit action to select between. That’s basically what Swift enums are 
> (and the same concept is found in a lot of other languages like Haskell, 
> Erlang, Scala, Rust…)


Indeed. The common term in FP is "sum types". (Also known as "variant 
types" or "tagged unions", though I dislike that last term as it 
emphasizes implementation rather than purpose - another easy engineer 
trap.) Here's a quick read:

https://www.fpcomplete.com/school/to-infinity-and-beyond/pick-of-the-week/sum-types

Basically, just think of `FooType?` as a syntactic shorthand for writing 
`FooType | NoneType`, i.e. the sum of FooType and NoneType. [1]


Similarly, `foo = foo_or_none as FooType!` is just shorthand for 
concisely expressing a common use-case where your code can't reasonably 
continue unless the value given is an instance of FooType:


case foo_or_none of
    SomeType (foo) -> [process the foo value as normal]
    NoneType -> [throw a standard exception]
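
For comparison, here is roughly the same case analysis spelled in Swift's own
syntax (a sketch; FooType's shape, `fooOrNone`, and the handling code are all
made up for illustration):

    // Hypothetical stand-ins for `FooType` and `foo_or_none`:
    struct FooType { let label: String }
    let fooOrNone: FooType? = FooType(label: "example")

    switch fooOrNone {
    case .some(let foo):
        print("processing \(foo.label)")    // use the unwrapped FooType as normal
    case .none:
        fatalError("expected a FooType")    // the "standard exception" branch
    }

    // ...which is what the one-line force-unwrap spelling collapses to:
    let foo = fooOrNone!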

Frankly, if you want to grouse about something, grouse about Swift's 
love of syntactic special forms, which makes the language look even more 
semantically complex and disjointed than it actually is. Having cut my 
coder's teeth on AppleScript, I know this special hell only too well 
already.



> There’s a school of thought that null pointers are harmful; optionals 
> are a reaction to that. I just looked up the source — Tony Hoare gave a 
> presentation where he formally apologized for inventing null pointers in 
> 1965 as part of ALGOL W: [...] It’s a great quote, but I don’t think 
> that was the first appearance of null. LISP dates back to the late ‘50s 
> and has always had nil references (right?)


Lisp has a `nil` object. That's not the same thing as a nil pointer. The 
first is an actual Thing; the second is a promise to give you a thing 
that instead drops you down a hole when you actually ask for it.



HTH

has

[1] Pseudocode, obviously. I *really* wish Swift designers had copied 
FP's elegant type declaration and pattern matching syntax, instead of 
godawful C++ hideousness. It's so much cleaner it isn't funny.


Re: Optionals? A better option!

2015-05-14 Thread Jens Alfke
With all due respect, I think you’re falling into the common engineer pitfall 
of jumping to the conclusion that there’s a trivial solution without first 
understanding the problem. (Sometimes expressed as “any bug in your program is 
trivial; any bug I have to fix is intractable.”) Which is to say that, if you 
really want to engage in productive debate or provide alternatives, you should 
spend some time learning the theory behind languages and also looking at 
non-C-like languages, especially functional ones. (Apologies if you’ve got such 
experience, but based on your answer I’m guessing you don’t.)

I’m not terribly qualified here; I’ve dabbled in language and compiler design 
in the distant past. Not enough to make me an expert, but enough to make me 
respect that there are a lot of hidden difficulties and ramifications to even 
simple language features.

Optionals come out of a long line of thinking in functional programming 
languages. The broader idea is that if a value can have multiple mutually 
exclusive states (in this case “has a value” vs “has no value”) then those 
states should be tagged and should require explicit action to select between. 
That’s basically what Swift enums are (and the same concept is found in a lot 
of other languages like Haskell, Erlang, Scala, Rust…)
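
As a sketch of that idea (the name Maybe is made up here, but the standard
library's Optional has essentially this shape):

    enum Maybe<Wrapped> {
        case none              // "has no value"
        case some(Wrapped)     // "has a value", tagged with the wrapped value
    }

    // Selecting between the states takes an explicit case analysis:
    func describe(_ value: Maybe<Int>) -> String {
        switch value {
        case .none:           return "no value"
        case .some(let n):    return "value: \(n)"
        }
    }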

There’s a school of thought that null pointers are harmful; optionals are a 
reaction to that. I just looked up the source — Tony Hoare gave a presentation 
where he formally apologized for inventing null pointers in 1965 as part of 
ALGOL W:

"I call it my billion-dollar mistake. It was the invention of the null 
reference in 1965. At that time, I was designing the first comprehensive type 
system for references in an object oriented language (ALGOL W). My goal was to 
ensure that all use of references should be absolutely safe, with checking 
performed automatically by the compiler. But I couldn't resist the temptation 
to put in a null reference, simply because it was so easy to implement. This 
has led to innumerable errors, vulnerabilities, and system crashes, which have 
probably caused a billion dollars of pain and damage in the last forty years.”
— http://en.wikipedia.org/wiki/Tony_Hoare#Apologies_and_retractions 


It’s a great quote, but I don’t think that was the first appearance of null. 
LISP dates back to the late ‘50s and has always had nil references (right?)

—Jens



Re: Optionals? A better option!

2015-05-14 Thread Quincey Morris
On May 14, 2015, at 10:52 , Kyle Sluder  wrote:
> 
> FWIW, NSNotFound is defined as NSIntegerMax, so it has the same value in
> both signed and unsigned contexts.

a. Oh, yeah, I knew that. 

b. It kinda proves my point, though. I write enough Obj-C code that I shouldn’t 
confuse myself about this, but I just did.




Re: Optionals? A better option!

2015-05-14 Thread Kyle Sluder
On Thu, May 14, 2015, at 12:34 PM, Quincey Morris wrote:
> — It’s used in both signed and unsigned contexts, so it really has 2
> values

FWIW, NSNotFound is defined as NSIntegerMax, so it has the same value in
both signed and unsigned contexts.
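
A quick sketch of why that holds: NSIntegerMax's sign bit is clear, so the same
bit pattern reads as the same number whether it's interpreted as NSInteger or
NSUInteger.

    import Foundation

    print(NSNotFound == Int.max)                          // true: NSNotFound is NSIntegerMax
    print(UInt(bitPattern: NSNotFound) == UInt(Int.max))  // true: same value either way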

--Kyle Sluder


Re: Optionals? A better option!

2015-05-14 Thread Quincey Morris
On May 14, 2015, at 09:50 , William Squires  wrote:
> 
>  Have the compiler/linker enforce that all variables are initialized to zero 
> (Int, Float, Double), false (Bool), empty (String, array, dictionary), or nil 
> (object reference) if the coder doesn't specify them. (in the case of an 
> enumeration, it would either be the 1st enumerated constant, or the one whose 
> raw value = 0; implicitly or explicitly)

The problem being solved in Swift is to *eliminate* the brute-force 
initialization of instance variables. That’s in part because the all-zero-bits 
default that Obj-C gives you isn’t necessarily the correct value in every case, 
and in part because the zeroing is redundant when there is also code that 
initializes the ivars explicitly.

You may think this duplication is minor, but it starts to matter when the 
language (e.g. Swift) has “value classes” (i.e. structs that can have 
initializers and methods). In that case, a lot of new struct instances can be 
created, and it’s desirable to avoid emitting code for unnecessary 
initializations.
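
A minimal sketch of the rule being described (hypothetical type): the compiler
demands that every stored property get an explicit value before an init
completes, rather than silently zero-filling it.

    struct Temperature {
        var celsius: Double

        init(fahrenheit: Double) {
            // Delete this line and the init no longer compiles:
            celsius = (fahrenheit - 32) / 1.8
        }
    }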

> There's not really any need for optional values if all variables are always 
> initialized.

I think you’re utterly wrong about this. Optionality is not just about making 
sure that variables are initialized. It’s also about introducing a language 
construct that expresses the fact that some variables don’t always have a 
value. For example, you can’t put nil pointers in an NSArray, which means you 
can’t easily have a sparse array. Similarly, you don’t have a direct way of 
expressing the fact that a function might return a value or not. This is not 
about initialization, it’s about the optionality of values, so why not make 
optionality a first-class language concept?
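
For instance (a made-up example), “might return a value or not” is carried by
the return type itself, and callers can’t ignore it:

    func parsedPort(from string: String) -> Int? {
        guard let value = Int(string), (1...65535).contains(value) else {
            return nil                      // "no value" is part of the declared type
        }
        return value
    }

    if let port = parsedPort(from: "8080") {
        print("connecting on port \(port)") // only reachable with a real value in hand
    }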

> And - as a programmer - you can always revert to initializing it to a 
> 'canary' value (with an appropriately named constant; no magic numbers, 
> please!) that you can test against after receiving user input, which IMHO, is 
> good defensive programming, anyway.

Yes, we have [NSNull null] and NSNotFound, for example. However, these are 
*terrible* solutions to the problem — though they’re the only solutions Obj-C 
has, so they don’t always seem terrible when you’re used to them.

The problem with [NSNull null] is that it has the wrong class. That means you 
always have to code around it.
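
A sketch of that coding-around (contrived values): every read has to
special-case the placeholder before it can treat the element as the real type.

    import Foundation

    let row: [Any] = ["Alice", NSNull(), 42]
    for value in row {
        if value is NSNull {
            print("missing")                // the stand-in, not a usable value
        } else {
            print("value: \(value)")
        }
    }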

NSNotFound has several problems:

— You don’t (officially) know *where* in the range of possible integers it 
falls, so you don’t know when calculations near that value are dangerous

— It’s used in both signed and unsigned contexts, so it effectively has two 
values

— Archiving a variable with the value NSNotFound is dangerous, because it’s not 
32/64-bit agnostic. That is, if you archive it when NSUInteger is 32 bits, and 
unarchive it when NSUInteger is 64 bits, it’s not going to be NSNotFound any 
more.
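
For contrast, a small sketch of sentinel versus optional (made-up data, current
API spellings): the Foundation call hands back NSNotFound, while the Swift call
makes “not found” a separate, explicit state.

    import Foundation

    let names = ["Alice", "Bob"]

    let sentinel = (names as NSArray).index(of: "Carol")  // returns NSNotFound (NSIntegerMax)
    let wrapped  = names.firstIndex(of: "Carol")          // returns nil, of type Int?

    if let i = wrapped {
        print("found at \(i)")
    } else {
        print("not found")                                 // the absent case is explicit
    }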
