> On Oct 2, 2017, at 7:57 PM, Xiaodi Wu <xiaodi...@gmail.com> wrote:
> 
> On Mon, Oct 2, 2017 at 9:04 PM, David Sweeris <daveswee...@mac.com> wrote:
> 
> On Oct 2, 2017, at 5:45 PM, Xiaodi Wu via swift-evolution 
> <swift-evolution@swift.org> wrote:
> 
>> On Mon, Oct 2, 2017 at 19:28, Ethan Tira-Thompson via swift-evolution 
>> <swift-evolution@swift.org> wrote:
>> I’m all for fixing pressing issues requested by Xiaodi, but beyond that I 
>> request we give a little more thought to the long term direction.
>> 
>> My 2¢ is that I’ve been convinced very few characters are “obviously” either 
>> an operator or an identifier across all contexts where they might be used. 
>> That would relegate the vast majority of the thousands of ambiguous 
>> characters to a committee deciding a single global usage, which is both a 
>> huge time sink and fundamentally flawed in approach, because the right 
>> classification depends on the context of who is using them.
>> 
>> For example, if a developer finds a set of symbols which perfectly denote 
>> some niche concept, do you really expect the developer to submit a proposal 
>> and wait months/years to get the characters classified and then a new 
>> compiler version to be distributed, all so that developer can adopt his/her 
>> own notation?
>> 
>> The Unicode Consortium already has a document describing which Unicode 
>> characters are suitable identifiers in programming languages, with guidance 
>> as to how to customize that list around the edges. This is already adopted 
>> by other programming languages. So, with little design effort, that task is 
>> not only doable but largely done.
>> 
>> As to operators, again, I am of the strong opinion that making it possible 
>> for developers to adopt any preferred notation for any purpose (a) is 
>> fundamentally incompatible with the division between operators and 
>> identifiers, as I believe you’re saying here; and (b) should be a non-goal 
>> from the outset. The only task left to do, so far as I can tell, is to 
>> identify what pragmatic set of (mostly mathematical) symbols are used as 
>> operators in the wider world and are likely to be already used in Swift code 
>> or to be part of common use cases where an operator is clearly superior to 
>> alternative spellings. In my view, the set of valid operator characters not 
>> only shouldn’t require parsing or import directives, but should be small 
>> enough to commit to memory.
> 
> The set notation operators should be identifiers, then?
> 
> Set notation operators aren't valid identifier characters; to be clear, the 
> alternative to being a valid operator character would be simply not listing 
> that character among valid operator or identifier characters.
>  
> Because the impression I got from the Set Algebra proposal a few months ago 
> is that there are a lot of people who’ve never even seen those operators, let 
> alone memorized them.
> 
> That's not the impression I got; the argument was that these symbols are hard 
> to type and _not more recognizable than the English text_, which is certainly 
> a plausible argument and the appropriate bar for deciding on a standard 
> library API name.
> 
> MHO is that the bar for a potentially valid operator character _for potential 
> use in third-party APIs_ needn't be so high that we demand that the character 
> be more recognizable to most people than alternative notations. Instead, we 
> can probably justify including a character if it is (a) plausibly useful for 
> some relatively common Swift use case and (b) at least somewhat recognizable 
> for many people. Since set algebra has a well-accepted mathematical notation 
> that's taught (afaict) at the _high school_ level if not earlier, and since 
> set algebra functions are a part of the standard library, that surely meets 
> those bars of usefulness and recognizability.
Maybe they've started teaching it earlier than when I went through school... I 
don't think I learned it until Discrete Math, which IIRC was a 2nd or 3rd year 
course at my college and only required for Math, CS, and maybe EE majors. 
Anyway, WRT a), if Swift achieves its "take over the world" goal, all use cases 
will be Swift use cases. WRT b), "many" as in the numerical quantity or "many" 
as in the percentage? There are probably millions of people who recognize 
calculus's operators, but there are 7.5 billion people in the world.
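
For concreteness, this is roughly what's at stake if ∪ and ∩ stay valid 
operator characters. It's only a sketch: these operators aren't standard 
library API, and the precedence assignments are my own guesses.

infix operator ∪: AdditionPrecedence
infix operator ∩: MultiplicationPrecedence

func ∪ <S: SetAlgebra>(lhs: S, rhs: S) -> S {
    return lhs.union(rhs)
}

func ∩ <S: SetAlgebra>(lhs: S, rhs: S) -> S {
    return lhs.intersection(rhs)
}

let evens: Set = [0, 2, 4, 6]
let smalls: Set = [0, 1, 2, 3]
print((evens ∪ smalls).sorted())   // [0, 1, 2, 3, 4, 6]
print((evens ∩ smalls).sorted())   // [0, 2]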

> Keep in mind that Swift already goes far above and beyond in terms of 
> operators
Yep, that's a large part of why I'm such a Swift fan :-D

> in that: (a) it allows overloading of almost all standard operators; (b) it 
> permits the definition of effectively an infinite number of custom operators 
> using characters found in standard operators; (c) it permits the definition 
> of custom precedences for custom operators; and (d) it additionally permits 
> the use of a wide number of Unicode characters for custom operators. Most 
> systems programming languages don't even allow (a), let alone (b) or (c). 
> Even dramatically curtailing (d) leaves Swift with an unusually expansive 
> support for custom operators.
Yes, but many of those custom operators won't have a clear meaning, because 
operators built from the standard characters are rarely pre-existing symbols 
("++++++++" doesn't mean anything at all, AFAIK), and operators that are widely 
known within some field probably won't be widely known to the general public, 
which, IIUC, seems to be your standard for inclusion(?). Please let me know if 
that's not your position... I hate being misunderstood probably more than the 
next person, and I wouldn't want to be guilty of that myself.
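
For reference, here's roughly what (a) through (d) look like in practice today; 
the Vector2 type and the operator spellings below are invented purely for 
illustration.

struct Vector2 { var x, y: Double }

// (a) Overloading a standard operator for a custom type.
func + (lhs: Vector2, rhs: Vector2) -> Vector2 {
    return Vector2(x: lhs.x + rhs.x, y: lhs.y + rhs.y)
}

// (b) A custom operator spelled with standard operator characters.
infix operator +-: AdditionPrecedence
func +- (lhs: Vector2, rhs: Vector2) -> Vector2 {
    return Vector2(x: lhs.x + rhs.x, y: lhs.y - rhs.y)
}

// (c) A custom precedence group, used by (d) an operator spelled with a
// Unicode character.
precedencegroup DotProductPrecedence {
    associativity: left
    higherThan: MultiplicationPrecedence
}
infix operator •: DotProductPrecedence
func • (lhs: Vector2, rhs: Vector2) -> Double {
    return lhs.x * rhs.x + lhs.y * rhs.y
}

let v = Vector2(x: 1, y: 2) + Vector2(x: 3, y: 4)   // Vector2(x: 4.0, y: 6.0)
let d = Vector2(x: 1, y: 2) • Vector2(x: 3, y: 4)   // 11.0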

> What it does conclusively foreclose is something which ought to be stated 
> firmly as a non-goal, which is the typesetting of arbitrary mathematical 
> equations as valid Swift code.
I'm not arguing for adding arbitrary typesetting (not now anyway... maybe in a 
decade or so when more important things have been dealt with). What I am 
arguing for is the ability to treat operators from <pick your field> as 
operators within Swift. As much as possible, anyway.

> Quite simply, Swift is not math; simple addition doesn't even behave as it 
> does in grade-school arithmetic,
Perhaps not for the built-in numeric types, but how do you know somebody won't 
create a type which does behave that way?
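
To be concrete about the built-in behavior (and about what a custom type could 
do differently), here's a quick sketch; BigInt below is hypothetical, not a 
standard library type.

let x = Int.max
// x + 1                 // traps at runtime instead of producing a bigger number
print(x &+ 1)            // wraps around to Int.min

print(0.1 + 0.2 == 0.3)  // false: Double rounds in binary

// A hypothetical arbitrary-precision integer could make + exact and total:
// let big: BigInt = BigInt(Int.max) + 1   // no trap, no wrap (assumed API)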

> so there is no sense in attempting to shoehorn calculus into the language.
(I'm assuming you mean calculus's syntax, not calculus itself, right?) What's 
the point of having Unicode support if half the symbols will get rejected 
because they aren't well-known enough? Sometimes having only token support for 
something can be worse than no support at all, from the PoV of someone trying 
to do something that relies on it.
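
For what it's worth, Swift already accepts a wide range of Unicode in 
identifier position; the debate here is only about which symbols can appear as 
operators. The names below are just illustrative.

let π = Double.pi
let 半径 = 3.0
let 面積 = π * 半径 * 半径
print(面積)   // ~28.27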



In any case, I only meant to point out a use-case for lookalikes, not spark a 
debate about whether we should support more than a handful of operators... 
shall we consider the idea withdrawn?

- Dave Sweeris